
F2014-ACD-014

AUTOMATED GENERATION OF VIRTUAL DRIVING SCENARIOS FROM TEST DRIVE DATA

1 Roy Bours (*), 1 Martijn Tideman, 2 Ulrich Lages, 2 Roman Katz, 2 Martin Spencer
1 TASS International, Rijswijk, The Netherlands; 2 Ibeo Automotive Systems GmbH, Hamburg, Germany

KEYWORDS

ADAS, reference sensing, simulation, virtual testing, test driving

ABSTRACT

Intelligent vehicle systems such as ADAS, ITS and automated driving systems consist of increasing numbers of sensor technologies as well as increasingly advanced algorithms for sensor fusion, object tracking, object classification, risk estimation, driver status recognition and vehicle control. It is rapidly becoming infeasible to check the performance of each new (sensor) system in the traditional way: by performing test drives, storing data, manually labelling the data for reference, and manually evaluating the results. One approach to address these difficulties is to install a reference sensor system on the test vehicle in addition to the prototype sensor system (the device under test). The recorded data from the reference sensor system are processed partly or fully automatically to create reference scenarios, based on automatic object labelling and automatic event identification. Based on these reference data, the device under test can be automatically and objectively evaluated. The reference data from the reference sensor system can then be converted into a set of virtual scenarios which can be used within a CAE environment. These simulated ground-truth reference scenarios offer a platform for engineers to quickly check the consequences of design changes to the device under test, and allow engineers to subject the device under test to a wide variation of virtual traffic scenarios.

INTRODUCTION

In recent years, automotive active safety systems have become more prevalent. Furthermore, these systems will serve as the foundation for the roll-out of autonomously driven vehicles. Active safety encompasses a variety of applications, including: Adaptive Cruise Control (ACC), Forward Collision Warning (FCW), Automatic Emergency Braking (AEB), Lane Departure Warning (LDW), Blind Spot Warning (BLSW), Lane Keeping Assistance (LKA), Pedestrian Avoidance (PA), Intelligent Headlight Systems (IHS) and Cooperative Driving Systems (CDS).

With the rising level of automation onboard vehicles, intelligent vehicle systems have to deal with an increasing number of complex traffic scenarios. At the same time, the intelligent systems themselves are also becoming more complicated. They consist of more and more different sensor technologies as well as increasingly advanced algorithms for sensor fusion, object tracking, classification, risk estimation, driver status recognition and vehicle control. As a result, it is rapidly becoming infeasible to check the performance of each new (sensor) system in the traditional way: by driving around, storing data, manually labeling the data for reference, and manually evaluating the results.

ADAS TESTING USING REFERENCE SENSOR SYSTEMS

One approach to address these difficulties is to install an additional reference sensor system (RSS) on the vehicle, alongside the prototype sensor system that is going to be tested (the device under test, DUT). The recorded data from the reference sensor system are processed to automatically create reference scenarios, based on automatic labeling of relevant road users and a description of the background environment.
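To make the notion of a reference scenario concrete, the sketch below shows one plausible data model for the output of such a pipeline: labelled objects, each with a class, a trajectory and automatically identified events. All type and field names here are illustrative assumptions, not the actual Ibeo/TASS data model.

```python
# A minimal sketch of a reference scenario as produced by an RSS
# pipeline. Names and fields are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum


class ObjectClass(Enum):
    UNKNOWN = 0
    CAR = 1
    TRUCK = 2
    MOTORCYCLE = 3
    BICYCLIST = 4
    PEDESTRIAN = 5


@dataclass
class TrackedObject:
    object_id: int
    object_class: ObjectClass
    # One (t, x, y, heading) sample per scan, in a map-fixed frame.
    trajectory: list[tuple[float, float, float, float]] = field(default_factory=list)


@dataclass
class ReferenceScenario:
    ego_trajectory: list[tuple[float, float, float, float]]
    objects: list[TrackedObject]
    # Automatically identified events, e.g. ("cut_in", t) or ("hard_braking", t).
    events: list[tuple[str, float]] = field(default_factory=list)
```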
Based on these reference data, the device under test can be automatically and objectively verified [1,2].

LASER SCANNER ADVANCEMENTS

Several advancements in the field of laser scanning [3] allow the sensor technology to be used as a reference sensor system for automatically generating reference scenarios from laser scan data. The current generation of laser scanners can detect objects with very high precision relative to the scanner. If a digital map is available which contains landmarks with centimeter accuracy, then the absolute position of the laser scanner can be determined to within a few centimeters, thus improving the accuracy of all tracked objects. The scanner is effectively used as a differential sensor, analogous to Differential GPS (DGPS), but without the need for a base station or a clear view of the sky.
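The differential use of the scanner can be illustrated with a small alignment sketch: given landmark positions matched between the digital map and the scan, a 2D rigid registration yields the absolute scanner pose. The Kabsch/Umeyama-style least-squares fit below is our choice of method for illustration, not necessarily the one used in the actual system.

```python
# Hedged sketch of landmark-based localization: align landmarks
# measured in the scanner frame to their centimetre-accurate map
# positions with a 2D rigid (Kabsch-style) fit.
import numpy as np


def scanner_pose_from_landmarks(map_pts: np.ndarray,
                                scan_pts: np.ndarray) -> tuple[np.ndarray, float]:
    """map_pts, scan_pts: (N, 2) arrays of matched landmark positions.

    Returns (translation, heading) such that
    map_pt ~= R(heading) @ scan_pt + translation.
    """
    mu_map = map_pts.mean(axis=0)
    mu_scan = scan_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (scan_pts - mu_scan).T @ (map_pts - mu_map)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_map - R @ mu_scan
    heading = float(np.arctan2(R[1, 0], R[0, 0]))
    return t, heading
```

With three or more well-distributed landmarks in view, the residual of this fit also gives a direct measure of the localization uncertainty.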

FORWARD-BACKWARD TRACKING

A further improvement is the use of offline processing to fully analyze the scenario. State-of-the-art online tracking algorithms are inherently limited by the requirement to run in real time. If the data is reprocessed offline, it becomes possible to look ahead for all observations of an object and associate them with the first instant that the object is visible. This allows the system to be used as a true reference, since no online sensor can look into the future. It also allows the use of a Best Situation Classifier (BSC): the point in time at which an object is most clearly visible can be used to classify the object, and this classification can then be extended to the whole lifetime of the object. Additional benefits are improved robustness to occlusions, reduced uncertainty of the ego vehicle position between landmarks, and increased accuracy of object trajectories.

As an example, consider the scenario in Figure 1a. A sign and an oncoming vehicle are just visible to the laser scanner. Figures 1b and 1c show the results of two tracking approaches. In the online approach (Figure 1b), the approaching vehicle is initially detected, but it is not yet classified and its outline is not clear. As the vehicle gets closer, the outline can be clearly seen and it is classified as a car. In the offline approach (Figure 1c), this information is associated with the object at the first instant that it is visible, allowing a more accurate reconstruction of the scenario.

Figure 1a: Online object tracking at time T0. Figure 1b: Online object tracking at time TX. Figure 1c: Forward Backward Tracking principle.

This process, which provides object information in full detail already at the time of the first detection of the object at high distance, is called Forward Backward Tracking (FBT). It ensures that the laser scanners, used as a reference sensor system, provide detailed object information very early and with high precision at all times, as shown in the following example (Figure 2).
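Before turning to that example, the following sketch condenses the BSC and backward-propagation idea into a few lines: classify the object where it is seen best, then extend that label to every instant of its lifetime, including the first detection. The field names and the point-count visibility proxy are assumptions made for illustration.

```python
# Minimal sketch of Best Situation Classification (BSC) combined with
# the backward step of Forward Backward Tracking (FBT).
from dataclasses import dataclass


@dataclass
class Observation:
    t: float
    num_scan_points: int   # proxy for how clearly the object is visible
    tentative_class: str   # per-scan classifier output, e.g. "unknown"


def best_situation_classify(track: list[Observation]) -> str:
    """Pick the class label from the instant with the clearest view."""
    best = max(track, key=lambda obs: obs.num_scan_points)
    return best.tentative_class


def backward_propagate(track: list[Observation]) -> list[Observation]:
    """Assign the BSC label to all observations, first detection included."""
    label = best_situation_classify(track)
    return [Observation(obs.t, obs.num_scan_points, label) for obs in track]
```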

Figure 2: Application example of post-processing, comparing online processed data (forward-only tracking on the vehicle) against post-processing with Best Situation Classification (BSC) and Forward Backward Tracking (FBT).

VIRTUAL DRIVING SCENARIOS FROM REAL-WORLD TEST DRIVES

While test driving is still the main method for evaluating ADAS systems, the resulting data rarely contain events that would truly tax the active safety system, even with a million miles driven. Crashes are rare and difficult to capture; even near-crashes are rare. Hence, much of the collected data ends up being simple false-alarm data, i.e. driving with no difficult decision to make.

Many of the drawbacks of hardware testing of ADAS systems [4] are not present in a virtual test environment. Virtual testing with simulation software provides an efficient and safe environment in which to design and evaluate ADAS systems. Moreover, simulated scenarios are completely quantifiable, controllable and reproducible. However, the creation of many virtual driving scenarios can be a time-consuming and laborious activity, in particular when real testing scenarios and conditions must be manually converted into virtual scenarios for further testing. It is thus extremely valuable to be able to automatically obtain virtual scenarios from real-world test drives.

The reference data from the reference sensor system presented in the previous section are ideal to serve as the basis for the automatic creation of virtual scenarios. They can be converted into a set of simulated scenarios which can be used within a CAE environment. These simulated ground-truth reference scenarios offer a platform for engineers to quickly check the consequences of design changes to the device under test, or even to try out completely new system concepts.
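As a sketch of this conversion step, the snippet below serialises reference tracks into a simple scenario description that a simulation environment could import. The JSON layout is a hypothetical interchange format invented for this example, not PreScan's actual scenario format.

```python
# Hedged sketch: export reference tracks to a simple, hypothetical
# scenario interchange file for a simulation tool to import.
import json


def export_virtual_scenario(ego_trajectory, objects, path: str) -> None:
    """objects: iterable of (object_id, object_class, trajectory) tuples,
    where each trajectory is a list of (t, x, y, heading) samples."""
    scenario = {
        "ego": [{"t": t, "x": x, "y": y, "heading": h}
                for (t, x, y, h) in ego_trajectory],
        "actors": [
            {
                "id": obj_id,
                "class": obj_class,
                "trajectory": [{"t": t, "x": x, "y": y, "heading": h}
                               for (t, x, y, h) in traj],
            }
            for (obj_id, obj_class, traj) in objects
        ],
    }
    with open(path, "w") as f:
        json.dump(scenario, f, indent=2)
```

Because the exported actors carry ground-truth classes and trajectories, design variants of the device under test can be re-run against the same scenario file deterministically.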

RESULTS

The concept of creating virtual driving scenarios from real-world test drives was evaluated using an Ibeo laser scanner setup with two laser scanners (see Figure 3).

Figure 3: Ibeo sensor system mounted on a test vehicle. Red circles indicate the two LUX scanners of a two-sensor fusion system mounted on the front bumper of the car.

The recorded data is automatically labelled with a post-processing tool. These labels are then provided for the processing and evaluation of a device under test, and for the conversion to virtual driving scenarios in the simulation software PreScan [5]. Figures 4a and 4b show a sample result in which a passenger car and pedestrians are detected, classified, and converted to a virtual environment. The applied laser system is able to classify passenger cars, trucks, motorcycles, bicyclists and pedestrians.

Figure 4a: Reference scenario from a real-world test drive. Figure 4b: Virtual driving scenario derived from real-world test data.
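The labels produced by the post-processing tool also enable the automatic, objective evaluation mentioned earlier. Below is a minimal sketch of one such evaluation: per scan, DUT detections are matched to reference objects by nearest neighbour within a distance gate, and true positives, false positives and misses are counted. The gate value and the greedy matching are illustrative choices, not the authors' metric.

```python
# Hedged sketch of per-scan DUT evaluation against reference labels.
import math


def evaluate_frame(ref_positions, dut_positions, gate: float = 1.0):
    """ref_positions, dut_positions: lists of (x, y) in a common frame.

    Returns (tp, fp, fn) for one scan, using greedy nearest-neighbour
    matching inside a distance gate (metres).
    """
    unmatched_ref = list(ref_positions)
    tp = 0
    for dx, dy in dut_positions:
        best_i, best_d = None, gate
        for i, (rx, ry) in enumerate(unmatched_ref):
            d = math.hypot(dx - rx, dy - ry)
            if d <= best_d:
                best_i, best_d = i, d
        if best_i is not None:
            unmatched_ref.pop(best_i)  # each reference object matches once
            tp += 1
    fp = len(dut_positions) - tp
    fn = len(unmatched_ref)
    return tp, fp, fn
```

Aggregating these counts over a whole drive yields detection rate and false-alarm rate for the device under test without any manual labelling effort.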

CONCLUSION

This paper presented the use of advanced perception systems for obtaining reference data for the automated generation of simulated driving scenarios. We described our advances in the field of laser scan processing for reference generation, and illustrated the use of reference data for constructing simulated virtual scenarios that can be loaded, manipulated and used within a commercial simulator.

REFERENCES

(1) Ulrich Lages, Martin Spencer, Roman Katz, "Automatic Scenario Generation based on Laserscanner Reference Data and Advanced Offline Processing", Proceedings of the IEEE Intelligent Vehicles Symposium, Gold Coast, Australia, June 2013.
(2) Martin Spencer, Roman Katz, Ulrich Lages, "Forward-Backward Object Tracking for Generation of Reference Scenarios Based on Laser Scan Data", Proceedings of the ITS World Congress, Detroit, USA, 2014 [pending approval].
(3) Ibeo Automotive Systems GmbH, http://www.ibeo-as.com, 2014.
(4) Roy Bours, Tino Schulze, Takahito Nakano, Eiji Teramura, Martijn Tideman, "Integral Simulation Toolset for ADAS Modeling", Proceedings of the JSAE Conference, Yokohama, Japan, May 2012.
(5) TASS International, http://www.tassinternational.com/prescan, 2014.