Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios


Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios
Daniel Goehring

Motivation
- Direction-to-object detection: what is possible with cost-efficient microphone arrays, e.g. from the Kinect?
- Fusion of multiple non-synchronized Kinect audio sensors and evaluation with data from Lidar sensors
- Application of the solution in real-world traffic scenarios

Contribution
Main components:
- audio-based detection of objects with a single Kinect microphone array
- creation of a representation for the belief distribution of object directions
- combination of the belief distributions of two Kinect microphone arrays
- implementation on a real autonomous car using the OROCOS framework
- synchronization and evaluation of the algorithm with Lidar point clouds from Ibeo Lux sensors

Test platform
- Vehicle: VW Passat Variant, modified by VW; drive- and steer-by-wire, CAN
- Positioning system: Applanix POS LV 510 (IMU, odometer, correction data via UMTS)
- Camera systems: 4 wide-angle cameras, 2 INKA cameras (Hella Aglaia), 2 Guppy cameras for traffic light detection, Continental lane detection
- Laser scanner: Ibeo Lux 6-fusion system
- 3D laser scanner: Velodyne HDL-64E
- Radar systems: 2 short-range (BSD, 24 GHz), 4 long-range (ACC, 77 GHz), 1 SMS (24 GHz)

Kinect sensor (Schematic)
- 4 microphones; only the left and right outer microphones were used in our approach (gray circles in the schematic)
- outer microphone distance: approx. 22 cm

Signal shift calculation via cross-correlation
For continuous signals $f$ and $g$:
$$(f \star g)(\tau) = \int_{-\infty}^{\infty} f(t)\, g(t + \tau)\, dt$$
For discrete signals $f$ and $g$:
$$(f \star g)[n] = \sum_{m=-\infty}^{\infty} f[m]\, g[m + n]$$
We are interested in the delay $n^*$ between the two discrete microphone signals:
$$n^* = \arg\max_n \,(f \star g)[n]$$
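A minimal runnable sketch of this delay estimation in Python (function and variable names are ours, not from the talk):

```python
import numpy as np

def estimate_delay(left, right):
    """Integer sample shift between two equally long microphone signals,
    taken as the argmax of their discrete cross-correlation.
    A positive result means `left` arrives later than `right`."""
    corr = np.correlate(left, right, mode="full")   # lags -(N-1) .. N-1
    lags = np.arange(-(len(right) - 1), len(left))
    best = np.argmax(corr)
    return lags[best], corr[best]                   # delay n*, belief value
```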

Time delay between the two microphones
- the two microphones provide audio signals with a sampling rate of 16.8 kHz
- the smallest measurable time difference between the two microphone signals is one sampling period, $\Delta t = 1/16.8\,\mathrm{kHz} \approx 59.5\,\mu\mathrm{s}$, which translates, given the speed of sound (340 m/s), into a distance difference of $\Delta d = c\,\Delta t = 340\,\mathrm{m/s} / 16.8\,\mathrm{kHz} \approx 0.02\,\mathrm{m}$

Time delay between the two microphones (contd.)
- the two microphones are 0.22 m apart (base distance)
- given the base distance, the signal shift, and an assumed distance to the object (far away, e.g. 25 m), we have a defined triangle
- from this triangle we can calculate the angle to the object (see the bearing formula below)
- w.r.t. symmetry on a plane, two solutions remain
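The slide does not spell out the formula; as a sketch under the usual far-field assumption (object distance much larger than the 0.22 m baseline), the bearing $\theta$ measured from broadside follows from the sample shift $n$:

$$\sin\theta = \frac{\Delta d}{b} = \frac{n\,c}{f_s\,b}, \qquad c = 340\,\mathrm{m/s},\quad f_s = 16.8\,\mathrm{kHz},\quad b = 0.22\,\mathrm{m}$$

The two remaining planar solutions are $\theta$ and $180^\circ - \theta$, mirrored across the microphone baseline.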

Distribution of possible angles to object
- the sampling frequency is limited to 16.8 kHz
- for the given base distance of the two microphones (0.22 m) and a given signal shift, there are only about $2 \cdot 0.22\,\mathrm{m} \cdot 16.8\,\mathrm{kHz} / 340\,\mathrm{m/s} \approx 22$ possible discrete outcomes for angular directions
- approx. 46 different angular segments (with symmetry); see the enumeration sketch below
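As a hedged sketch (constants taken from the slides), enumerating the discrete bearings the limited sampling rate allows:

```python
import numpy as np

C = 340.0     # speed of sound [m/s]
FS = 16800.0  # microphone sampling rate [Hz]
B = 0.22      # microphone base distance [m]

# largest usable integer sample shift: |n| <= B * FS / C  (~10.9)
n_max = int(B * FS / C)
shifts = np.arange(-n_max, n_max + 1)   # roughly the ~22 discrete shift outcomes

# each shift maps to one bearing in [-90, 90] degrees (far-field assumption);
# mirroring across the baseline doubles this to the ~46 segments on the slide
bearings_deg = np.degrees(np.arcsin(shifts * C / (FS * B)))
print(len(shifts), bearings_deg.round(1))
```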

Angular segment distribution
- the different segments (46) cover angular intervals of different widths
- each segment can be interpreted as a belief cell for an object in an angular direction interval
- the radius drawn for each segment represents the cross-correlation value (belief)

Combination of Kinect sensors
- symmetry disambiguation on a plane can be achieved with two Kinects (2 microphones each), rotated by 90 degrees towards each other
- Kinect 1 can distinguish between left and right, but not between the front and rear direction
- Kinect 2 can distinguish between front and rear, but not between the left and right direction

Combination of Kinect sensors (contd.)
- symmetry disambiguation on a plane can be achieved with two Kinect microphone pairs oriented 90 degrees towards each other
- for fusion, we subsampled the two non-equally spaced histograms into two equally spaced histograms
- the value of each non-uniform belief cell is assigned to (split among) the uniform belief cells it fully or partially covers, as sketched below
- the two Kinect belief distributions are combined via cell-wise multiplication
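A sketch of the subsampling and fusion step as we read it (the cell layout and helper names are our assumptions):

```python
import numpy as np

def resample_to_uniform(edges_deg, beliefs, n_bins=64):
    """Split each non-uniform belief cell among the uniform cells it
    overlaps, proportionally to the overlapped fraction of the cell.
    edges_deg has length len(beliefs) + 1 and spans [0, 360]."""
    uniform = np.zeros(n_bins)
    width = 360.0 / n_bins
    for lo, hi, b in zip(edges_deg[:-1], edges_deg[1:], beliefs):
        first = int(lo // width)
        last = min(int(np.ceil(hi / width)) - 1, n_bins - 1)
        for k in range(first, last + 1):
            overlap = min(hi, (k + 1) * width) - max(lo, k * width)
            if overlap > 0:
                uniform[k] += b * overlap / (hi - lo)
    return uniform

# cell-wise multiplication then fuses the two Kinect belief distributions:
# fused = resample_to_uniform(edges1, beliefs1) * resample_to_uniform(edges2, beliefs2)
```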

Subsampling of belief cells and fusion for two Kinect sensors
[Figure: Kinect 1 oriented forward, Kinect 2 oriented sideways; 46 non-uniform segments subsampled into 64 uniform segments]

Traffic Example
[Figure: Kinect facing to the front; the length of each (non-equal) angular segment represents the angular belief. Passing car shown in the Lidar data.]

Traffic Example, Step by Step
[Figure: Kinect facing to the front, front/rear symmetry (yellow axis); Kinect facing to the side, left/right symmetry (orange axis)]

Traffic Example, Step by Step (contd.)
[Figure: non-uniform vs. uniform segments]

Traffic Example, Step by Step (contd.)
[Figure: non-uniform vs. uniform segments; after data fusion, no symmetries remain]

Object Direction calculation
- What is the angle to the object?
- after fusion, the angular segment with the highest value wins (maximum likelihood)
- drawback: only one direction is reported
- for multiple objects it would be possible to search for multiple large angular segments (not too close to each other), as sketched below
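A small sketch of this selection step (the multi-object variant uses a generic peak finder as a stand-in; the talk does not give the exact criterion):

```python
import numpy as np
from scipy.signal import find_peaks

def object_directions(fused_belief, n_bins=64, min_separation_cells=4):
    """Maximum-likelihood direction plus well-separated extra peaks.
    Note: find_peaks treats the histogram as linear, ignoring wrap-around."""
    width = 360.0 / n_bins
    best = (np.argmax(fused_belief) + 0.5) * width            # ML direction
    peaks, _ = find_peaks(fused_belief, distance=min_separation_cells)
    return best, (peaks + 0.5) * width                        # multi-object candidates
```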

Experimental setup
- the approach was tested with our autonomous car in a real traffic situation
- driving the car created too much wind noise
- therefore the car was parked at the side of the road, and passing vehicles were detected

Experimental evaluation
- the Ibeo Lux Lidar system (6 scanners) was used to evaluate the accuracy of the sound source localization
- idea: compare the angle calculated from audio data with the closest angle of a moving obstacle obtained from the Lidar (error metric sketched below)
- Lidar objects were clustered and tracked from point cloud data
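The error metric as we read the slide, in a short sketch (angle conventions are our assumption):

```python
import numpy as np

def angular_error_deg(audio_bearing, lidar_bearings):
    """Absolute angular distance from the audio bearing to the closest
    tracked moving Lidar obstacle; all bearings in degrees."""
    diffs = (np.asarray(lidar_bearings) - audio_bearing + 180.0) % 360.0 - 180.0
    return float(np.min(np.abs(diffs)))
```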

Demo: Videos 1 and 2


Experimental results
- angular error over time w.r.t. Lidar data
- angular error standard deviation: 10.3 degrees

Experimental results (contd.)
[Figure: angular error over distance w.r.t. Lidar data]

Experimental results (contd.)
[Figure: angular error over distance]

Experimental results (contd.)
- for close objects it is usually hard to tell the exact angle, due to their size
- therefore the error for more distant objects was often smaller than for close ones
- other inaccuracies were caused by sound reflections off houses and trees close to the street
- errors caused by the finite speed of sound in combination with high car velocities could not be measured (city traffic, 50 km/h)

Conclusion
- we presented an approach to calculating angles to objects using acoustic data from 2 Kinect microphone pairs
- we showed how data from two non-synchronized devices can be combined using subsampled, uniformly spaced angular interval segments
- detected angles were assigned and compared to real-world Lidar data
- the approach was implemented on a real autonomous car using a modular robotics framework (OROCOS) and tested in a real-world traffic situation

Future Work
Current challenges:
- wind noise while driving
- how to keep track of multiple sources
- how to handle sound reflections, e.g. from buildings, trees, etc.

Future Work (contd.)
- how can the signal intensity be used for distance estimation?
- band-pass filters / FFT can help to select specific signals, e.g. emergency vehicles with characteristic horn frequencies and signal patterns; furthermore, try to detect the alternation between the two frequencies (see the filter sketch below)
- use tracking in combination with the Doppler effect to estimate velocities while a vehicle is passing (change in frequency)
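A hedged sketch of the band-pass idea for siren-like signals (the 410/585 Hz tone pair is the common German siren convention, not a number from the talk):

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 16800  # Kinect microphone sampling rate [Hz]

def siren_band(signal, lo_hz=350.0, hi_hz=650.0, order=4, fs=FS):
    """Band-pass around the alternating siren tones (~410 Hz / ~585 Hz)
    to suppress broadband road and wind noise before cross-correlation."""
    sos = butter(order, [lo_hz, hi_hz], btype="bandpass", fs=fs, output="sos")
    return sosfilt(sos, signal)
```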