Evaluation of a laser-based reference system for ADAS
23rd ITS World Congress, Melbourne, Australia, October 2016

Paper number ITS-EU-TP0045

N. Steinhardt 1*, S. Kaufmann 2, S. Rebhan 1, U. Lages 2, Y. Noutangnin 1, C. Goerick 1, C. Ahlers 2
1. Honda Research Institute Europe (HRI-EU) GmbH, Carl-Legien-Strasse 30, Offenbach, Germany, Nico.Steinhardt@honda-ri.de
2. Ibeo Automotive Systems GmbH, Merkurring 60-62, Hamburg, Germany, stefan.kaufmann@ibeo-as.com

Abstract

The evaluation and verification of advanced driver assistance systems (ADAS) is crucial for guaranteeing the safe operation of such systems on the road. To cover a wide variety of real-world situations, ground-truth data needs to be generated for large datasets, rendering complete manual annotation infeasible. However, when automatic ground-truth reference systems are employed, the performance and accuracy of these systems become subject to validation themselves. In this paper, we present a three-step methodology for characterizing a 360-degree reference system based on vehicle-mounted laser scanners, which is widely used for automated labelling of data. Furthermore, a high-accuracy ground-truth reference system (GTRS) is presented that is used to determine the object classification and tracking performance as well as the position accuracy of the device under test (DuT). The methodology comprises accuracy evaluation through measurement series of increasing complexity, starting with idealized objects and advancing to complex real-world traffic situations. In this way, the cause of accuracy changes can be pinned down to properties of the object or scene constellation, allowing a deeper understanding of the performance that can be expected from the DuT on real-world data. This paper presents the first step of the methodology, which covers the abstraction from idealized objects to real traffic participants.

Keywords: Ground-truth measurement, object validation, LIDAR
Introduction

Perception of the environment is crucial for advanced driver assistance systems (ADAS). Modern vehicles are equipped with various sensors which extract information from the environment and provide it to the ADAS. The development of such sensors and the related perception algorithms, as well as the development of ADAS functions, requires reference data for testing and validation. Creating this reference data normally involves a large amount of manual labelling work, so typically only a few reference datasets are generated. However, a large database of reference scenarios speeds up the development process and improves the perception quality of environmental sensors and algorithms. This paper presents an approach for generating reference data with a minimal amount of manual labelling work. This data will be used to evaluate the quality of the Ibeo reference system [1]. Related work with a slightly different methodological focus can be found in [2] and [3]. The approach is based on the Ibeo reference system, a laser scanner fusion system with a 360° field of view mounted on a passenger car (Figure 1). This system generates data for dynamic objects in the vehicle environment, including relevant properties such as object position, yaw angle, velocity, yaw rate and object class.

Figure 1: Ibeo Reference System Setup

Using the aforementioned ground-truth reference system (GTRS) data, we describe a systematic approach for evaluating the dynamic object tracking performance of the Ibeo reference system. The reference system under evaluation is referred to as the DuT in the following.

Methodology

The core idea of the evaluation methodology is to stepwise derive the real-world accuracy of the DuT and to identify all influences involved. This is done in a top-down procedure, starting with static idealized objects with exactly known properties and advancing towards more and more realistic setups. The setups of all scenarios are measured by a ground-truth reference system (GTRS), which allows identifying both the basic DuT accuracy and the additional uncertainties introduced by each step towards more realistic scenarios. To obtain more statistically significant measurements, the measurements are taken using two different DuTs, both series vehicles equipped with the Ibeo laser-based reference system. The approach is organized in three phases:

Phase 1: Identification of the base DuT accuracy, taking the step from idealized static objects to real static objects, and from real static objects to real dynamic objects. Creation of statistics for object classification and object position/velocity accuracy. Measurements on test tracks with exactly known geometric properties.

Phase 2: Application of the statistics to predefined traffic situations on real roads, taking the step from idealized traffic situations to real traffic situations on highways and rural roads. Creation of statistics for scenario detection (e.g., cut-in, cut-out and overtaking).

Phase 3: Application of the statistics from Phases 1 and 2 to take the step to complex urban traffic situations. Creation of statistics for complex scenario understanding.

Within the scope of this paper, only the work of Phase 1 is presented. The evaluation of the DuT proceeds from idealized objects to real objects. The following static and dynamic scenarios are planned for Phase 1 and described in detail in the following sections:

1. Setup of static scenarios using object boxes (rectangular cuboids) with the same maximum dimensions as real traffic participants (i.e., pedestrian, motorbike and car). These tests provide statistics for the maximum achievable static position accuracy of the DuT on targets with precisely known geometric properties. They are conducted from different distances, covering the entire detection range of the DuT (from short range up to the maximum detection range).

2. Repetition of all step 1 measurements using the corresponding real traffic participants. This test provides statistics for the influence of a realistic vehicle shape on the static position accuracy of the DuT compared to idealized targets, and thereby performs the first step from ideal to real objects.

3. Setup of dynamic traffic situations using the same targets as in steps 1 and 2 on a test track with a predefined and geometrically known road setup (e.g., oncoming traffic and intersections). The measurements include static, constantly moving, and accelerating/braking targets, thereby taking the second step from static to dynamic objects. These tests provide statistics about the velocity accuracy and dynamic position accuracy of the DuT.

4. All of the tests are used to obtain statistics for classification reliability.
Approach for ground-truth data generation

For the evaluation of the dynamic object tracking provided by the DuT, a ground-truth reference system (GTRS) is used, as illustrated in Figure 2. It consists of a static, external measurement setup and provides an accuracy of 20 mm for the evaluation of the DuT data. This accuracy figure includes the GTRS accuracy of 7 mm (noise average) and the accuracy of the equipment used to measure the GTRS relative positions and box positions against ground truth (a Leica Disto D5 with 3 mm accuracy and a meter stick with an estimated accuracy of 10 mm). The GTRS consists of two long-range distance sensors (SICK DML40-2) and two laptops for recording the data. In addition, video streams are recorded for scene understanding. As the test setups require different positioning of the long-range scanners (parallel setup, orthogonal setup), the relative position of the scanners is determined using geometric methods (e.g., triangulation) prior to the measurements. The accuracy estimation for the different GTRS setups is given in the corresponding sections below. In addition, Garmin GPS receivers are used for time synchronization of the GTRS as well as of the DuTs. This ensures that the local timestamps are synchronized to UTC, so that all measurements share the same time base and can easily be synchronized in the data evaluation phase. The GPS position itself is not evaluated, as its accuracy is lower than the expected DuT accuracy. For better comparability of the data, all measurements from the GTRS as well as from the DuT are projected to the same reference point: the center of the rear axle of the respective vehicle. Known measurement offsets (e.g., the mounting position of the triple reflector or the size of the bumper), the viewing direction onto the vehicle (front, back) and their dynamic changes are also compensated.
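The stated 20 mm figure is consistent with a worst-case (linear) sum of the three error sources named above. The sketch below illustrates such an error budget; the linear summation rule is an assumption, since the paper does not state how the components were combined, and a root-sum-square combination is shown only for comparison.

```python
def linear_budget(components_mm):
    """Worst-case accuracy: absolute sum of the individual error bounds."""
    return sum(abs(c) for c in components_mm)

def rss_budget(components_mm):
    """Root-sum-square combination, shown for comparison (appropriate only
    if the error sources are independent and statistical)."""
    return sum(c * c for c in components_mm) ** 0.5

# GTRS noise average, Leica Disto D5, meter stick -- values from the text (mm)
errors = [7.0, 3.0, 10.0]
print(linear_budget(errors))         # 20.0 -- matches the stated 20 mm
print(round(rss_budget(errors), 1))  # 12.6
```

The linear sum reproduces the 20 mm quoted in the text, which suggests the authors took the conservative worst-case view rather than a statistical combination.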
Figure 2: Ground Truth Reference System (GTRS): two long-range scanners, Garmin GPS receivers for time synchronization, and cameras and laptops for data recording

Step 1: Static position accuracy for ideal objects

Following the aforementioned methodology, the first tests are performed to obtain the object position accuracy of the DuT for ideal objects. The DuT can classify six object types: pedestrian, motorbike, car, truck, unknown small and unknown big. Since the object classification is strongly coupled to the size of the object, object boxes of different sizes have been used for a motorbike and a car. As shown in Figure 3, a Honda CBF1000 motorcycle (2.165 m x 0.732 m, incl. rider) and a Honda Civic (4.285 m x 1.770 m) have been used as real objects, and the idealized boxes have been designed accordingly. For the pedestrian, a cylinder-shaped tube (h = 2.0 m, d = 0.6 m) has been used. No idealized box was built for the truck, since a truck already has a shape close to a box; this avoided the construction of a very large and unwieldy box.

Figure 3: Setup of static position accuracy test with ideal object boxes

The static position accuracy tests have been performed with different distances and orientations of the ideal boxes. A list of all the tests performed on the ideal boxes is given in Table 1.
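Because the DuT couples object class to object size, the role of the targets' dimensions can be illustrated with a toy size-based classifier. All thresholds below are illustrative assumptions loosely derived from the target dimensions quoted above; the actual DuT classification logic is not disclosed in the paper.

```python
def classify_by_size(length_m, width_m):
    """Toy classifier: map a tracked object's bounding-box footprint to one
    of the DuT's object classes. All thresholds are assumptions."""
    if length_m < 1.0 and width_m < 1.0:
        return "pedestrian"   # e.g. the 0.6 m diameter cylinder target
    if length_m < 2.5 and width_m < 1.0:
        return "motorbike"    # e.g. 2.165 m x 0.732 m (CBF1000 incl. rider)
    if length_m < 5.5 and width_m < 2.2:
        return "car"          # e.g. 4.285 m x 1.770 m (Civic)
    if length_m >= 5.5:
        return "truck"
    return "unknown"

print(classify_by_size(0.6, 0.6))      # pedestrian
print(classify_by_size(2.165, 0.732))  # motorbike
print(classify_by_size(4.285, 1.770))  # car
```

A classifier of this kind makes clear why boxes with the same maximum dimensions as the real traffic participants are suitable idealized stand-ins: class-relevant geometry is preserved while surface shape is simplified.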
Table 1: Test scenarios for static position accuracy tests (objects: pedestrian, motorbike, car and truck at varying orientations; DuT distances: 7 m, 20 m, 50 m and 100 m)

Measurement Principle

The measurement principle for obtaining statistics about the static position accuracy of the DuT is indicated in Figure 5. Two GTRS scanners have been used to obtain the distance of the DuT relative to the scanners. The geometrical uncertainty of the GTRS in this measurement setup is 20 mm, as described above. Two triple glass reflectors have been attached diagonally to the DuT to provide two well-defined points to be measured by the GTRS (see Figure 4), so that both the position and the heading angle of the DuT relative to the objects can be determined by triangulation. All other distances shown in the figure (e.g., the distance between the GTRS scanners and the distances to the object) have been measured using a handheld laser distance measurement device. As listed in Table 1, the setup was repeated for different object classes, object orientations and distances (d_dut) in order to obtain statistics.

Figure 4: Glass triple reflector mounted on the roof of the DuT (left) and GTRS scanner mounted on a tripod (right)
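The triangulation step can be sketched as a classical two-circle intersection: each GTRS scanner constrains a reflector to lie on a circle around the scanner's (previously surveyed) position. The function below is an illustrative sketch under that assumption, not the paper's actual computation; the ambiguity between the two intersection points is resolved by the known setup geometry.

```python
import math

def trilaterate(s1, s2, d1, d2):
    """Intersect the two circles (center s1, radius d1) and (center s2,
    radius d2) in 2D. Returns both intersection points; the physically
    plausible one follows from the measurement geometry."""
    dx, dy = s2[0] - s1[0], s2[1] - s1[1]
    d = math.hypot(dx, dy)                       # scanner baseline length
    ex, ey = dx / d, dy / d                      # unit vector along baseline
    a = (d1 * d1 - d2 * d2 + d * d) / (2 * d)    # along-baseline coordinate
    h = math.sqrt(max(d1 * d1 - a * a, 0.0))     # perpendicular offset
    px, py = s1[0] + a * ex, s1[1] + a * ey      # foot point on the baseline
    return ((px - h * ey, py + h * ex), (px + h * ey, py - h * ex))

# A reflector at (5, 4) m, scanners at (0, 0) and (10, 0):
# the two candidates are (5, 4) and its mirror image (5, -4).
print(trilaterate((0, 0), (10, 0), math.hypot(5, 4), math.hypot(5, 4)))
```

Locating both reflectors this way yields two points on the DuT, from which its heading angle follows directly from the vector between them.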
Figure 5: Measurement principle for static position accuracy tests (setups 1 and 2; measured distances d, d_lf, d_rf, d_lr, d_rr and d_dut between the scanners, the triple glass reflectors on the DuT and the object)

Step 2: Static position accuracy for real objects

After considering object boxes, real objects have been used and all the setups of step 1 have been repeated.

Step 3: Dynamic position/velocity accuracy for realistic traffic situations

The first two steps were dedicated to obtaining statistics for static traffic situations. As described in the methodology, this section takes the step to dynamic measurements. The values of interest are again the position accuracy as well as the accuracy of the object velocity estimated by the DuT. The setup of the GTRS for longitudinal measurements is shown in Figure 6. The GTRS measures the longitudinal object position at a rate of 200 Hz. The velocity is computed as the derivative of the GTRS position measurements. For the final statistical evaluation it will additionally be checked whether this value requires filtering. The uncertainty of the GTRS setup is 10 mm (no meter stick is used in this setup). These measurements are used as ground-truth data and are compared to the velocity data provided by the DuT.

Figure 6: Longitudinal velocity measurements using the GTRS
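Differentiating the 200 Hz GTRS position track can be sketched with a simple finite difference. A centered moving average is included as one plausible smoothing option, since the paper leaves open whether the raw derivative requires filtering; both the window size and the example trajectory are illustrative.

```python
def velocity_from_positions(positions_m, rate_hz=200.0):
    """First-order finite difference of the GTRS longitudinal positions."""
    dt = 1.0 / rate_hz
    return [(b - a) / dt for a, b in zip(positions_m, positions_m[1:])]

def moving_average(values, window=5):
    """Centered moving average -- one plausible filter for the raw derivative."""
    half = window // 2
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A target moving at a constant 10 m/s, sampled at 200 Hz:
pos = [10.0 * k / 200.0 for k in range(8)]
vel = velocity_from_positions(pos)
print([round(v, 6) for v in vel])  # each sample is 10.0 m/s (up to rounding)
```

At 200 Hz the differentiation step amplifies the 10 mm position noise into velocity noise of up to a few m/s per sample, which is why the paper's open question about filtering matters for the final statistics.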
Setups for longitudinal motion

The setups for longitudinal motion tracking are shown in Table 2; the measurement setup is shown in Figure 6. The geometrical uncertainty of the GTRS in this measurement setup is 10 mm. A subset of combinations of DuT speed and object speed was considered in this experiment. The selection targets two criteria: on the one hand, situations that are generally known to be difficult for object tracking algorithms (e.g., very large and very low velocity differences, strong changes of velocity) shall be contained; on the other hand, these manoeuvres shall be measured in a context in which they are most likely to occur in real-world traffic. Therefore, the following scenarios were selected:

- Setups with typical inner-city speeds (30 km/h, 50 km/h), plus 100 km/h and dynamic tests (acceleration up to 100 km/h and hard braking to a full stop)
- Setups with both high and low relative velocity between DuT and object for overtaking scenarios
- Low and high relative velocities for oncoming traffic situations

Negative speed values in the table indicate oncoming traffic relative to the DuT. Columns and rows with speed ranges indicate dynamic driving behaviour. Dynamic tests have been performed both for opposite and for identical driving directions of DuT and object. In the case of opposite driving directions, the DuT and object passed each other at high speed on neighbouring lanes and a full braking manoeuvre was performed by the drivers to stop the vehicles in front of each other; after stopping, the vehicles accelerated again to high speed. These tests were performed to obtain values for the position and speed accuracy of the detected objects under high relative accelerations. For setups with identical driving directions and speed ranges, the vehicles performed an overtaking manoeuvre multiple times (one vehicle accelerated while the other was braking, and vice versa).
Table 2: Overview of the performed longitudinal tracking tests (combinations of DuT speed and object speed, from standstill up to 100 km/h and dynamic speed ranges, for pedestrian, motorbike, car and truck targets)

Setups for lateral motion

As mentioned before, the tracking performance for lateral motion is evaluated with a different setup, since the GTRS can only track one-dimensional motion. The setup itself is illustrated in Figure 7. The geometrical uncertainty of the GTRS in this measurement setup is 10 mm. The performed tests are listed in Table 3.
Figure 7: Setup of the intersection scenarios

Again, the selection of the test setups followed the methodology of covering low and high relative velocities between the DuT and the objects. For the static object tests (speed 0 km/h), the objects were placed close to the intersection. For the other tests, the objects stopped in front of the intersection to let the DuT vehicle pass.

Table 3: Overview of the performed lateral tracking tests (DuT speeds of 30, 50 and 100 km/h; object speeds from standstill up to low crossing speeds for pedestrian, motorbike, car and truck targets)

An example of the lateral tracking tests, approaching a truck which is stopping at the intersection, is shown in Figure 8. The figure shows the recorded data from the DuT for three different time frames.
Figure 8: Lateral tracking test. DuT speed: 100 km/h; truck at 30 km/h stopping at the intersection

Preliminary Experimental Results

Even though the evaluation of the experimental results is still ongoing, we present some preliminary results in this section.

Figure 9: Comparison of object position measured by the DuT and by the GTRS

The results for the setup measuring a real car (test scenario ) at large acceleration values are shown in Figure 9. As described before, the DuT and the car were overtaking each other multiple times. As already mentioned, all measurements refer to the center of the rear axle of the vehicles. The bottom plot shows the position error. The figure shows several peaks in the position error for certain time ranges, and these peaks have different causes. Peaks 2 and 3 correspond to time ranges (typically a few milliseconds) where no GTRS signal was available (the scanner did not hit the reflector) or other disturbances were introduced into the reference measurement. These parts of the measurements will not be considered for the overall performance evaluation. To identify these parts, characteristic features are used; in this case, the characteristic is the derivative of the target positions measured by the GTRS. If the reference GTRS position changes faster than the physical capabilities of the target allow, the data point is marked as invalid. Peaks 1, 4, 5 and 6 in Figure 9, on the other hand, correspond to real differences between GTRS and DuT: in these cases the GTRS data are well-formed, and the DuT measured a different distance to the object than the GTRS. These are the peaks that have to be considered in the overall performance statistics later on and will be analysed to determine the accuracy of the DuT. The resulting statistics for this test scenario, excluding the false GTRS data of Peaks 2 and 3, were a maximum deviation of m (Peak 4 in Figure 9) and a minimum of m (Peak 1 in Figure 9); the mean deviation between DuT and GTRS was m and the standard deviation was m.

Conclusion and Outlook

A top-down methodology for evaluating a laser-based reference system has been presented. Relevant scenarios have been identified and used to generate data on a proving ground using two different DuTs. In the single experiment shown, the accuracy of the DuT would have been sufficiently high for its use as a reference system after the removal of the outliers. As soon as the complete available data has been processed, more profound statistics on object tracking position and velocity accuracy will be generated from the static and dynamic scenarios shown in this paper. Additional parameters such as the dependencies on velocity, acceleration and distance will be identified and taken into account. These results will then be the basis for the real-world driving scenarios in project phase 2.
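The derivative-based plausibility check and the resulting error statistics described above can be sketched as follows. The speed limit v_max and all sample values are illustrative assumptions, not parameters from the paper.

```python
def mark_valid(gtrs_pos_m, rate_hz=200.0, v_max=60.0):
    """Flag GTRS samples whose implied speed exceeds the assumed physical
    capability of the target (v_max in m/s); such samples are invalid."""
    dt = 1.0 / rate_hz
    valid = [True] * len(gtrs_pos_m)
    for i in range(1, len(gtrs_pos_m)):
        if abs(gtrs_pos_m[i] - gtrs_pos_m[i - 1]) / dt > v_max:
            valid[i] = False
    return valid

def error_stats(dut_pos_m, gtrs_pos_m, valid):
    """Max/min/mean/standard deviation of (DuT - GTRS) over valid samples."""
    errs = [d - g for d, g, ok in zip(dut_pos_m, gtrs_pos_m, valid) if ok]
    mean = sum(errs) / len(errs)
    std = (sum((e - mean) ** 2 for e in errs) / len(errs)) ** 0.5
    return max(errs), min(errs), mean, std

# Sample 2 is a dropout spike in the GTRS track (scanner missed the reflector);
# the jump back at sample 3 is flagged as well, since its implied speed is
# also unphysical.
gtrs = [10.00, 10.05, 55.00, 10.15, 10.20]
dut = [10.02, 10.06, 10.11, 10.17, 10.21]
valid = mark_valid(gtrs)
print(valid)  # [True, True, False, False, True]
print(error_stats(dut, gtrs, valid))
```

This mirrors the procedure in the text: dropout peaks like Peaks 2 and 3 are excluded before the maximum, minimum, mean and standard deviation of the DuT-versus-GTRS error are computed.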
As the measurements were conducted with well-known targets, only the verified, true-positive object detections were considered for the classification reliability. Statistics about object detection reliability will be part of project phase 3.

References

1. M. Spencer, R. Katz, U. Lages (2014), Forward-Backward Object Tracking for Generation of Reference Scenarios Based on Laser Scan Data, in Proceedings of the 21st World Congress on ITS, Detroit.
2. O. Wulf, A. Nüchter, J. Hertzberg, B. Wagner, Ground truth evaluation of large urban 6D SLAM, in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).
3. J.L. Blanco, F.-A. Moreno, J. Gonzalez (2009), A collection of outdoor robotic datasets with centimeter-accuracy ground truth, Autonomous Robots 27(4).
More informationOption Driver Assistance. Product Information
Product Information Table of Contents 1 Overview... 3 1.1 Introduction... 3 1.2 Features and Advantages... 3 1.3 Application Areas... 4 1.4 Further Information... 5 2 Functions... 5 3 Creating the Configuration
More informationApplication_Database.docx
Application Database Deliverable n. D1.1.1 Application Database Sub Project SP 1 SP Requirements and Specifications Workpackage WP 1.1 Application Needs Task n. T 1.1.2 Application needs identifications
More informationA Street Scene Surveillance System for Moving Object Detection, Tracking and Classification
A Street Scene Surveillance System for Moving Object Detection, Tracking and Classification Huei-Yung Lin * and Juang-Yu Wei Department of Electrical Engineering National Chung Cheng University Chia-Yi
More informationObject Fusion for an Advanced Emergency Braking System (AEBS) Jonny Andersson
Object Fusion for an Advanced Emergency Braking System (AEBS) Agenda 1. Rear- end collisions & EU legislation 2. How the AEB system works 3. Object fusion methods 4. Simulink implementation 5. Sensor visualisation
More informationMap Guided Lane Detection Alexander Döbert 1,2, Andre Linarth 1,2, Eva Kollorz 2
Map Guided Lane Detection Alexander Döbert 1,2, Andre Linarth 1,2, Eva Kollorz 2 1 Elektrobit Automotive GmbH, Am Wolfsmantel 46, 91058 Erlangen, Germany {AndreGuilherme.Linarth, Alexander.Doebert}@elektrobit.com
More informationSOLUTIONS FOR TESTING CAMERA-BASED ADVANCED DRIVER ASSISTANCE SYSTEMS SOLUTIONS FOR VIRTUAL TEST DRIVING
SOLUTIONS FOR TESTING CAMERA-BASED ADVANCED DRIVER ASSISTANCE SYSTEMS SOLUTIONS FOR VIRTUAL TEST DRIVING Table of Contents Motivation... 3 Requirements... 3 Solutions at a Glance... 4 Video Data Stream...
More informationA New Method in Shape Classification Using Stationary Transformed Wavelet Features and Invariant Moments
Original Article A New Method in Shape Classification Using Stationary Transformed Wavelet Features and Invariant Moments Arash Kalami * Department of Electrical Engineering, Urmia Branch, Islamic Azad
More informationRotational3D Efficient modelling of 3D effects in rotational mechanics
Rotational3D - Efficient Modelling of 3D Effects in Rotational Mechanics Rotational3D Efficient modelling of 3D effects in rotational mechanics Johan Andreasson Magnus Gäfvert Modelon AB Ideon Science
More informationMeasuring the World: Designing Robust Vehicle Localization for Autonomous Driving. Frank Schuster, Dr. Martin Haueis
Measuring the World: Designing Robust Vehicle Localization for Autonomous Driving Frank Schuster, Dr. Martin Haueis Agenda Motivation: Why measure the world for autonomous driving? Map Content: What do
More informationOptical Sensors: Key Technology for the Autonomous Car
Optical Sensors: Key Technology for the Autonomous Car Rajeev Thakur, P.E., Product Marketing Manager, Infrared Business Unit, Osram Opto Semiconductors Autonomously driven cars will combine a variety
More informationTurning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018
Turning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018 Asaf Moses Systematics Ltd., Technical Product Manager aviasafm@systematics.co.il 1 Autonomous
More informationTraffic Technology. PoliScan Scanning Lidar: Metrological Principle and Application. VITRONIC Group. 7/15/2013 1
Traffic Technology PoliScan Scanning Lidar: Metrological Principle and Application 7/15/2013 1 Agenda Company Profile Metrological Principles of PoliScan Laser Based Speed Enforcement Laser Based Red Light
More informationCreating Affordable and Reliable Autonomous Vehicle Systems
Creating Affordable and Reliable Autonomous Vehicle Systems Shaoshan Liu shaoshan.liu@perceptin.io Autonomous Driving Localization Most crucial task of autonomous driving Solutions: GNSS but withvariations,
More informationReal-World Static Map Data and its Connection to Dynamic Scenario Generation. Dr.-Ing. Gunnar Gräfe, 3D Mapping Solutions GmbH
Real-World Static Map Data and its Connection to Dynamic Scenario Generation Dr.-Ing. Gunnar Gräfe, 3D Mapping Solutions GmbH 3D Mapping Solutions GmbH The company - Headquarters: Holzkirchen, near Munich,
More informationConstruction of Semantic Maps for Personal Mobility Robots in Dynamic Outdoor Environments
Construction of Semantic Maps for Personal Mobility Robots in Dynamic Outdoor Environments Naotaka Hatao, Satoshi Kagami, Ryo Hanai, Kimitoshi Yamazaki and Masayuki Inaba Abstract In this paper, a construction
More informationDETECTION OF STREET-PARKING VEHICLES USING LINE SCAN CAMERA. Kiyotaka HIRAHARA, Mari MATSUDA, Shunsuke KAMIJO Katsushi IKEUCHI
DETECTION OF STREET-PARKING VEHICLES USING LINE SCAN CAMERA Kiyotaka HIRAHARA, Mari MATSUDA, Shunsuke KAMIJO Katsushi IKEUCHI Institute of Industrial Science, University of Tokyo 4-6-1 Komaba, Meguro-ku,
More informationTRAFFIC DATA FUSION OF VEHICLE DATA TO DETECT SPATIOTEMPORAL CONGESTED PATTERNS
19th ITS World Congress, Vienna, Austria, 22/26 October 2012 EU-00014 TRAFFIC DATA FUSION OF VEHICLE DATA TO DETECT SPATIOTEMPORAL CONGESTED PATTERNS H. Rehborn*, M. Koller#, B. S. Kerner* *Daimler AG,
More informationIMPROVING ADAS VALIDATION WITH MBT
Sophia Antipolis, French Riviera 20-22 October 2015 IMPROVING ADAS VALIDATION WITH MBT Presented by Laurent RAFFAELLI ALL4TEC laurent.raffaelli@all4tec.net AGENDA What is an ADAS? ADAS Validation Implementation
More informationSeam tracking for fillet welds with scanner optics
Lasers in Manufacturing Conference 2015 Seam tracking for fillet welds with scanner optics Friedhelm Dorsch, Holger Braun, Dieter Pfitzner TRUMPF Laser- und Systemtechnik GmbH, Johann-Maus-Str. 2, 71254
More informationCritical Aspects when using Total Stations and Laser Scanners for Geotechnical Monitoring
Critical Aspects when using Total Stations and Laser Scanners for Geotechnical Monitoring Lienhart, W. Institute of Engineering Geodesy and Measurement Systems, Graz University of Technology, Austria Abstract
More informationRECURRENT NEURAL NETWORKS
RECURRENT NEURAL NETWORKS Methods Traditional Deep-Learning based Non-machine Learning Machine-Learning based method Supervised SVM MLP CNN RNN (LSTM) Localizati on GPS, SLAM Self Driving Perception Pedestrian
More informationAdvanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module
Advanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module www.lnttechservices.com Table of Contents Abstract 03 Introduction 03 Solution Overview 03 Output
More informationANALYZING AND COMPARING TRAFFIC NETWORK CONDITIONS WITH A QUALITY TOOL BASED ON FLOATING CAR AND STATIONARY DATA
15th World Congress on Intelligent Transport Systems ITS Connections: Saving Time, Saving Lives New York, November 16-20, 2008 ANALYZING AND COMPARING TRAFFIC NETWORK CONDITIONS WITH A QUALITY TOOL BASED
More informationWhere s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map
Where s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map Sebastian Scherer, Young-Woo Seo, and Prasanna Velagapudi October 16, 2007 Robotics Institute Carnegie
More informationDomain Adaptation For Mobile Robot Navigation
Domain Adaptation For Mobile Robot Navigation David M. Bradley, J. Andrew Bagnell Robotics Institute Carnegie Mellon University Pittsburgh, 15217 dbradley, dbagnell@rec.ri.cmu.edu 1 Introduction An important
More informationImproving autonomous orchard vehicle trajectory tracking performance via slippage compensation
Improving autonomous orchard vehicle trajectory tracking performance via slippage compensation Dr. Gokhan BAYAR Mechanical Engineering Department of Bulent Ecevit University Zonguldak, Turkey This study
More informationIP-S2 HD HD IP-S2. 3D Mobile Mapping System. 3D Mobile Mapping System
HD HD 3D Mobile Mapping System 3D Mobile Mapping System Capture Geo-referenced, Time-Stamped Point Clouds and Imagery 3D Scanning of Roadside Features 360º Camera for Spherical Image Capture Dual Frequency
More informationProbabilistic Sensor Models for Virtual Validation Use Cases and Benefits
Probabilistic Sensor Models for Virtual Validation Use Cases and Benefits Dr. Robin Schubert Co-Founder & CEO BASELABS GmbH 2 BASELABS enables data fusion results. 3 Who we are What we do We partner Data
More informationLaser Scanner-Based Navigation and Motion Planning for Truck-Trailer Combinations
Laser Scanner-Based Navigation and Motion Planning for Truck-Trailer Combinations Roland Stahn 1,2, Tobias Stark 1 and Andreas Stopp 1 1 DaimlerChrysler AG, Group Research, Assistance and Safety Systems,
More informationVEHIL - HIL testing of advanced driver assistance systems Ploeg, J.
VEHIL - HIL testing of advanced driver assistance systems Ploeg, J. Published in: Proceedings of the Tagung "Hardware-in-the-Loop Simulation für Mechatronik-Systeme im Kfz", 18-19 Oktober 2005, Essen Published:
More informationT O B C A T C A S E E U R O S E N S E D E T E C T I N G O B J E C T S I N A E R I A L I M A G E R Y
T O B C A T C A S E E U R O S E N S E D E T E C T I N G O B J E C T S I N A E R I A L I M A G E R Y Goal is to detect objects in aerial imagery. Each aerial image contains multiple useful sources of information.
More informationChapter 2 Trajectory and Floating-Car Data
Chapter 2 Trajectory and Floating-Car Data Measure what is measurable, and make measurable what is not so. Galileo Galilei Abstract Different aspects of traffic dynamics are captured by different measurement
More informationRemoving Drift from Inertial Navigation System Measurements RPUG Robert Binns Mechanical Engineering Vehicle Terrain Performance Lab
Removing Drift from Inertial Navigation System Measurements RPUG 2009 Mechanical Engineering Vehicle Terrain Performance Lab December 10, 2009 Outline Laboratory Overview Vehicle Terrain Measurement System
More informationTRAFFIC INFORMATION SERVICE IN ROAD NETWORK USING MOBILE LOCATION DATA
TRAFFIC INFORMATION SERVICE IN ROAD NETWORK USING MOBILE LOCATION DATA Katsutoshi Sugino *, Yasuo Asakura **, Takehiko Daito *, Takeshi Matsuo *** * Institute of Urban Transport Planning Co., Ltd. 1-1-11,
More informationDETECTION OF 3D POINTS ON MOVING OBJECTS FROM POINT CLOUD DATA FOR 3D MODELING OF OUTDOOR ENVIRONMENTS
DETECTION OF 3D POINTS ON MOVING OBJECTS FROM POINT CLOUD DATA FOR 3D MODELING OF OUTDOOR ENVIRONMENTS Tsunetake Kanatani,, Hideyuki Kume, Takafumi Taketomi, Tomokazu Sato and Naokazu Yokoya Hyogo Prefectural
More informationMobile Human Detection Systems based on Sliding Windows Approach-A Review
Mobile Human Detection Systems based on Sliding Windows Approach-A Review Seminar: Mobile Human detection systems Njieutcheu Tassi cedrique Rovile Department of Computer Engineering University of Heidelberg
More informationVISION FOR AUTOMOTIVE DRIVING
VISION FOR AUTOMOTIVE DRIVING French Japanese Workshop on Deep Learning & AI, Paris, October 25th, 2017 Quoc Cuong PHAM, PhD Vision and Content Engineering Lab AI & MACHINE LEARNING FOR ADAS AND SELF-DRIVING
More informationCollision Warning and Sensor Data Processing in Urban Areas
Collision Warning and Sensor Data Processing in Urban Areas Christoph Mertz, David Duggins, Jay Gowdy, John Kozar, Robert MacLachlan, Aaron Steinfeld, Arne Suppé, Charles Thorpe, Chieh-Chih Wang The Robotics
More informationJeffrey A. Schepers P.S. EIT Geospatial Services Holland Engineering Inc. 220 Hoover Blvd, Suite 2, Holland, MI Desk
Jeffrey A. Schepers P.S. EIT Geospatial Services Holland Engineering Inc. 220 Hoover Blvd, Suite 2, Holland, MI 49423 616-594-5127 Desk 616-322-1724 Cell 616-392-5938 Office Mobile LiDAR - Laser Scanning
More informationMotion Classification for Cross Traffic in Urban Environments Using Laser and Radar
Motion Classification for Cross Traffic in Urban Environments Using Laser and Radar Richard Matthaei Institute of Control Engineering Technische Universität Braunschweig Braunschweig, Germany matthaei@ifr.ing.tu-bs.de
More informationThe Influence of Yaw Movements on the Rating of the Subjective Impression of Driving
The Influence of Yaw Movements on the Rating of the Subjective Impression of Driving Thomas Fortmüller, Martin Meywerk Automotive and Powertrain Engineering Helmut-Schmidt-University / University of the
More informationAUTOMATIC DRAWING FOR TRAFFIC MARKING WITH MMS LIDAR INTENSITY
AUTOMATIC DRAWING FOR TRAFFIC MARKING WITH MMS LIDAR INTENSITY G. Takahashi a, H. Takeda a, Y. Shimano a a Spatial Information Division, Kokusai Kogyo Co., Ltd., Tokyo, Japan - (genki_takahashi, hiroshi1_takeda,
More informationAUTOMATIC PARKING OF SELF-DRIVING CAR BASED ON LIDAR
AUTOMATIC PARKING OF SELF-DRIVING CAR BASED ON LIDAR Bijun Lee a, Yang Wei a, I. Yuan Guo a a State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University,
More informationUAV Autonomous Navigation in a GPS-limited Urban Environment
UAV Autonomous Navigation in a GPS-limited Urban Environment Yoko Watanabe DCSD/CDIN JSO-Aerial Robotics 2014/10/02-03 Introduction 2 Global objective Development of a UAV onboard system to maintain flight
More informationNaturalistic observations to investigate conflicts between drivers and VRU in the PROSPECT project
Naturalistic observations to investigate conflicts between drivers and VRU in the PROSPECT project Marie-Pierre Bruyas, Sébastien Ambellouis, Céline Estraillier, Fabien Moreau (IFSTTAR, France) Andrés
More informationMOBILE INSPECTION SYSTEM FOR HIGH-RESOLUTION ASSESSMENT OF TUNNELS
MOBILE INSPECTION SYSTEM FOR HIGH-RESOLUTION ASSESSMENT OF TUNNELS M. Gavilán*, F. Sánchez, J.A. Ramos and O. Marcos EUROCONSULT GROUP Avenida Montes de Oca 9-11, 28703, Madrid, Spain *Corresponding author:
More information3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit
3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 9 Toyomi Fujita and Yuya Kondo Tohoku Institute of Technology Japan 1. Introduction A 3D configuration and terrain sensing
More informationControl Challenges in the Vehicle Infrastructure Initiative (VII)
Control Challenges in the Vehicle Infrastructure Initiative (VII) Hemant Sardar Automotive IEEE TCAC Workshop on Open Problems & Challenges in Automotive Control Introduction Background of ITS Overview
More informationRoberto Brignolo The SAFESPOT Integrated Project: Overview of the architecture, technologies and applications
Roberto Brignolo The SAFESPOT Integrated Project: Overview of the architecture, technologies and applications General figures 2 Project type: Integrated Project (IP) Co-funded by: the European Commission
More informationSupplier Business Opportunities on ADAS and Autonomous Driving Technologies
AUTOMOTIVE Supplier Business Opportunities on ADAS and Autonomous Driving Technologies 19 October 2016 Tokyo, Japan Masanori Matsubara, Senior Analyst, +81 3 6262 1734, Masanori.Matsubara@ihsmarkit.com
More informationUncertainties: Representation and Propagation & Line Extraction from Range data
41 Uncertainties: Representation and Propagation & Line Extraction from Range data 42 Uncertainty Representation Section 4.1.3 of the book Sensing in the real world is always uncertain How can uncertainty
More information