Solid State LiDAR for Ubiquitous 3D Sensing

April 6, 2016
Solid State LiDAR for Ubiquitous 3D Sensing
Louay Eldada, Ph.D., CEO and Co-founder, Quanergy Systems

New Paradigm in 3D Sensing
Disruptive Technologies:
- Solid State 3D LiDAR sensors
- Embedded processors (GPU)
- Inertial Measurement Units (IMU)
Advanced Systems:
- Autonomous Vehicles
- Smart Homes, Smart Security
- Robots, Drones, 3D-Aware Devices
Smart Solutions:
- Daily-updated, cm-accurate global 3D map
- GPS-free navigation through SLAM
- Smart IoT

LiDAR Application Pillars
- Industrial Automation
- Mapping
- Transportation
- Security

Some LiDAR Applications
- 3D Mapping & Navigation
- Safe & Autonomous Vehicles & Fleets
- Terrestrial & Aerial Robotics
- Smart Cities
- Industrial (Mining, Logistics, etc.)
- Smart Homes
- 3D-Aware Smart Devices
3D LiDAR sensors enable safety and efficiency in areas previously unserved due to: (1) cost, (2) performance, (3) reliability, (4) size, (5) weight, (6) power.

Why LiDAR
LiDAR is the most accurate perception sensor, providing:
- 3D shape with width/height information
- Distance with high accuracy
- Orientation
- Intensity

Capability                 LiDAR   Radar   Video
Range                      +++     +++     -
Range Rate                 ++      +++     -
Field of View              +++     ++      +
Width & Height             +++     -       +
3D Shape                   +++     -       -
Object Rec @ Long Range    +++     -       -
Accuracy                   +++     -       +
Rain, Snow, Dust           ++      +++     -
Fog                        +       +++     -
Night Time                 +++     +++     -
Read Signs & See Color     +       -       +++

Operating principle: a transmitter emits a laser pulse, a receiver detects the return from an obstacle, and range follows from the time-of-flight measurement, as in the sketch below.
Historically, LiDARs have been expensive, bulky, and unreliable (mechanical failure).
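
A minimal sketch of the time-of-flight relation behind the transmitter/receiver diagram; the 667 ns example value is illustrative, not from the slides:

```python
# Time-of-flight ranging sketch: range = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Convert a measured round-trip time (seconds) to range (meters)."""
    return C * round_trip_s / 2.0

if __name__ == "__main__":
    # A pulse returning after ~667 ns corresponds to a target ~100 m away.
    print(f"{tof_range_m(667e-9):.1f} m")  # -> 100.0 m
```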

Quanergy vs. Traditional LiDAR
Traditional solution: expensive, large, heavy, high power, low performance, low reliability, mechanical LiDAR
Quanergy solution: low cost, compact, lightweight, low power, high performance, high reliability, solid state LiDAR

Quanergy LiDAR Roadmap
Gen 1: Mechanical (Mark VIII, M8)
Gen 2: Solid State (S3 MCM)
Gen 3: All Solid State (S3 ASIC)
Volume pricing: Gen 1 <$1,000; Gen 2 <$250; Gen 3 <$100

Quanergy LiDARs
Designs focus simultaneously on cost, performance, reliability, size, weight, and power consumption.
Gen 1: Mechanical LiDAR (M8)
Gen 2 & 3: Solid State LiDAR (S3)
8 patents pending; 15 patents in preparation covering Gen 1, 2 & 3

Quanergy S3 LiDAR
[Figure: the S3 sensor, shown with a 60 mm scale]

S3 Operation Principle
OPA (Optical Phased Array) transmitter: a photonic IC that produces a far-field radiation pattern (laser spot)
[Figure: far-field spot, with overlaid far-field patterns for various steering angles]

OPA Operation Principle
[Figure: laser feeding a 1xN splitter and per-element phase shifters]
OPA stands for Optical Phased Array, the optical analog of phased-array radar.
An optical phased array has multiple optical antenna elements that are fed equal-intensity coherent signals; variable phase control at each element generates a far-field radiation pattern and points it in a desired direction, as sketched below.
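
To make the phase-steering idea concrete, here is a small numerical sketch under assumed parameters (905 nm wavelength, 64 elements at half-wavelength pitch; none of these values are from the slides, and this is not the S3 design):

```python
# Phased-array beam steering sketch. With N elements at pitch d and a uniform
# phase step dphi between neighbors, the main lobe of the far-field pattern
# points at sin(theta0) = -dphi / (k*d).
import numpy as np

wavelength = 905e-9          # assumed near-IR wavelength, common for LiDAR
k = 2 * np.pi / wavelength   # wavenumber
N, d = 64, wavelength / 2    # 64 elements at half-wavelength pitch (assumed)

def array_factor(theta, dphi):
    """Far-field intensity of N equal-intensity coherent emitters."""
    n = np.arange(N)
    phase = n * (k * d * np.sin(theta) + dphi)
    return np.abs(np.exp(1j * phase).sum()) ** 2

# Steer the beam to 10 degrees by choosing the per-element phase step.
theta0 = np.deg2rad(10.0)
dphi = -k * d * np.sin(theta0)
thetas = np.deg2rad(np.linspace(-60, 60, 2401))
peak = thetas[np.argmax([array_factor(t, dphi) for t in thetas])]
print(f"main lobe at {np.rad2deg(peak):.1f} deg")  # -> ~10.0 deg
```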

S3 Unique Capabilities
Software-controlled in real time:
- Adjustable window within the total available field of view
- Arbitrary distribution of points in the point cloud; point density within a frame need not be uniform (e.g., denser distribution around the horizon in a vehicle)
- Random access for maximum SNR at the receiver
- Largest VFOV (matching the 120° HFOV)
- Zoom in and out for coarse and fine views
- Adjustable frame rate based on situation analysis
- Directional range enhancement based on location in a pre-existing map (e.g., maximum forward range on a highway, maximum sideways range at an intersection)
A hypothetical configuration sketch follows.
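
As an illustration of what such real-time software control could look like, a configuration sketch; all field names and values are invented for illustration and are not Quanergy's API:

```python
# Hypothetical software-defined scan pattern configuration (illustrative only).
from dataclasses import dataclass

@dataclass
class ScanConfig:
    hfov_deg: tuple[float, float] = (-60.0, 60.0)  # window within the 120-deg HFOV
    vfov_deg: tuple[float, float] = (-10.0, 10.0)
    frame_rate_hz: float = 10.0                    # adjustable per situation analysis
    horizon_density_boost: float = 3.0             # denser sampling near the horizon
    range_boost_sector_deg: tuple[float, float] = (-10.0, 10.0)  # directional range boost

# e.g., maximum forward range on a highway vs. sideways range at an intersection
highway = ScanConfig(frame_rate_hz=30.0, range_boost_sector_deg=(-5.0, 5.0))
intersection = ScanConfig(frame_rate_hz=15.0, range_boost_sector_deg=(60.0, 120.0))
print(highway, intersection, sep="\n")
```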

Today's ADAS Use Various Sensors
- Lane Keeping
- Parking Assist
- Blind Spot Detection
- Adaptive Cruise Control & Traffic Jam Assist
- Front/Rear Collision Avoidance
- Cross Traffic Alert & Intersection Collision Avoidance
- Autonomous Emergency Braking & Emergency Steer Assist
- Object Detection, Tracking, Classification
- Scene Capture & Accident Reconstruction

Autonomous Vehicle Sensors
[Figure: Quanergy autonomous vehicle sensor suite]

Autonomous Car Sensing Systems

                Video Perception      Mechanical LiDAR        Solid State LiDAR
                for Semi-Autonomy     Perception for Autonomy Perception for Autonomy
Sensor suite    8 video cameras       2 LiDARs                2 LiDARs with video
                6 radars              8 video cameras         2 radars
                12 U/S sensors        6 radars
                                      12 U/S sensors
Total           26 sensors            28 sensors              4 sensors
ASP             $4,000                $6,000-$20,000          $1,000

LiDAR is the only acceptable sensor for object detection in autonomous cars operating in all environments, including urban areas with pedestrians (not just highways).
Sensors that detect and help avoid collisions with only 99% of objects (pedestrians, cyclists, vehicles, etc.) are unacceptable in fully autonomous cars; the reliability goal is ten 9s.
When LiDARs are mission critical, as in autonomous cars, and replace the sensors in today's sensing suite, they cannot have moving parts; they must be solid state.

S3 LiDAR Launched at CES 2016
- Two S3 LiDARs installed in the grille of a Mercedes-Benz GLE450 AMG Coupe (Daimler gift)
- Sensors invisible behind an IR-transparent fascia (built by Delphi)
- Pedestrians in front of the vehicle detected and the point cloud displayed in real time

Tegra X1-Based Automotive System
Real-time object detection, tracking, and classification is important in ADAS and critical in autonomous vehicles.
Tegra X1 (256-core Maxwell GPU, 8-core 64-bit CPU) is the preferred processor for neural-network deep learning in 3D sensing applications. Its GPU parallel cores are leveraged to:
- Rapidly train neural networks using large training sets
- Perform classification and prediction with the trained networks
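
A minimal sketch of those two GPU roles, a training loop followed by inference, using a toy PyTorch classifier; the features, labels, and network are synthetic stand-ins, not the models actually deployed on Tegra X1:

```python
# Toy train-then-classify sketch (illustrative only).
import torch
import torch.nn as nn

# Synthetic stand-in for per-object features extracted from LiDAR returns.
X = torch.randn(1024, 16)            # 1024 samples, 16 features
y = torch.randint(0, 3, (1024,))     # 3 classes, e.g., car / bike / pedestrian

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):                 # training: the GPU-parallel part
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

with torch.no_grad():                # inference: classification/prediction
    pred = model(X[:4]).argmax(dim=1)
print(pred)
```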

Nvidia DRIVE PX for AV
An autonomous vehicle (AV) needs to know its location accurately, recognize objects around it, and continuously calculate the optimal path for a safe driving experience.
This situational and contextual awareness of the car and its surroundings demands a powerful computing system such as DRIVE PX, which can merge data from LiDARs, cameras, other sensors, and navigation sources while figuring out the safest path in real time.
DRIVE PX combines deep learning, sensor fusion, scenario analysis, decision making, and triggering action, enabling self-driving applications to be developed faster and more accurately.
Key features of the platform include:
- Dual NVIDIA Tegra X1 processors delivering a combined 2.3 TFLOPS
- Interfaces for up to 12 cameras, plus radar, lidar, and ultrasonic sensors
- Rich middleware for graphics, computer vision, and deep learning
- Periodic software/OS updates
A time-alignment sketch of the sensor-merging idea follows.
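
One ingredient of merging multi-rate sensor data is time alignment. The sketch below buffers per-sensor messages and snapshots the latest sample from each within a time window; the class and its interface are invented for illustration and are not the DriveWorks API:

```python
# Hypothetical time-aligned sensor fusion buffer (illustrative only).
import bisect

class FusionBuffer:
    def __init__(self, sensors):
        self.buf = {s: [] for s in sensors}  # per-sensor sorted (timestamp, data)

    def push(self, sensor, t, data):
        bisect.insort(self.buf[sensor], (t, data))

    def snapshot(self, t, window=0.05):
        """Latest message from each sensor within `window` seconds before t."""
        out = {}
        for s, msgs in self.buf.items():
            i = bisect.bisect_right(msgs, (t, chr(0x10FFFF))) - 1
            if i >= 0 and t - msgs[i][0] <= window:
                out[s] = msgs[i][1]
        return out

fb = FusionBuffer(["lidar", "camera", "radar"])
fb.push("lidar", 0.100, "cloud#1"); fb.push("camera", 0.102, "frame#3")
fb.push("radar", 0.020, "track#7")  # stale: outside the 50 ms window
print(fb.snapshot(0.105))  # -> {'lidar': 'cloud#1', 'camera': 'frame#3'}
```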

DRIVE PX 2
Nvidia's DRIVE PX 2 module: up to 8 TFLOPS

Autonomous Driving Based on Deep Learning: Vehicle Configuration
- 4 Quanergy LiDARs on the 4 corners of the vehicle
  480,000 samples/sec per LiDAR (frame rate 1-1,000 frames/sec in S3)
  1,920,000 samples/sec for 4 LiDARs
- 4 surround-view video cameras on the 4 corners of the vehicle, soon to be integrated with LiDAR
  240,000 points/sec per camera (8,000 points/frame, 30 frames/sec)
  960,000 points/sec for 4 cameras
- GPS with 2-3 m positioning accuracy
- IMU (Inertial Measurement Unit) including accelerometer and gyroscope
- Car sensors: wheel speeds, wheel turning angles, etc.
The throughput sketch below checks these rates.
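
A quick sanity check of the per-sensor rates quoted above; the 16-bytes-per-point figure used for the bandwidth estimate is an assumption, not from the slides:

```python
# Sample-rate arithmetic from the slide, plus an assumed bandwidth estimate.
lidar_rate = 480_000          # samples/sec per LiDAR
print(4 * lidar_rate)         # -> 1,920,000 samples/sec for 4 LiDARs

camera_rate = 8_000 * 30      # points/frame x frames/sec
print(camera_rate)            # -> 240,000 points/sec per camera
print(4 * camera_rate)        # -> 960,000 points/sec for 4 cameras

# Rough data rate, hypothetically assuming 16 bytes per point:
total_points = 4 * lidar_rate + 4 * camera_rate
print(f"{total_points * 16 / 1e6:.1f} MB/s")  # -> ~46.1 MB/s
```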

Autonomous Driving Based on Deep Learning: Perception Pipeline
1. Vehicle LiDAR Raw Input: point cloud corrected using the IMU, plus video frames
2. Occupancy Map: created using LiDAR-video sensor fusion; the probabilistic map indicates which voxels are likely occupied
3. Object Detection (Occupancy Grid Detection and Tracking): run LiDAR and video output through a neural network trained to recognize and classify objects (cars, bikes, pedestrians, etc.)
4. Localization: determine position by registering within a pre-existing HD map and localize in a lane; use GPS, place the car in the lane, and compensate for GPS errors (GPS accuracy: several meters; accuracy needed: several cm)
5. Path Planning: run path/motion-planning algorithms, taking car kinematics into consideration, and decide whether to stay in the lane or switch lanes
6. Navigation: after intensive computation, if the decision is to take action, actuate the vehicle controls in near-real time to ensure safety and comfort
A skeleton of this pipeline is sketched below.
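
The skeleton below arranges these six stages as stub functions; the interfaces, data shapes, and grid size are invented for illustration, since the slides do not specify them:

```python
# Six-stage perception pipeline skeleton (stubs; illustrative only).
import numpy as np

def correct_point_cloud(raw_points, imu_pose):           # 1. raw input
    return raw_points  # motion compensation from the IMU would go here

def build_occupancy_map(points, video_frames):           # 2. occupancy map
    return np.zeros((200, 200), dtype=float)             # occupancy probability grid

def detect_objects(points, video_frames):                # 3. object detection
    return [{"cls": "pedestrian", "bbox": (1.0, 2.0, 0.5, 1.8)}]  # stub output

def localize(points, hd_map, gps_fix):                   # 4. localization
    return gps_fix  # refine meter-level GPS to cm level by HD-map registration

def plan_path(pose, objects, occupancy):                 # 5. path planning
    return "stay_in_lane"

def actuate(decision):                                   # 6. navigation
    print("actuating:", decision)

def tick(raw_points, imu_pose, frames, hd_map, gps_fix):
    pts = correct_point_cloud(raw_points, imu_pose)
    occ = build_occupancy_map(pts, frames)
    objs = detect_objects(pts, frames)
    pose = localize(pts, hd_map, gps_fix)
    actuate(plan_path(pose, objs, occ))

tick(np.zeros((0, 3)), None, [], None, (0.0, 0.0))
```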

Global Positioning of Point Clouds: Vehicle LiDAR Raw Data
[Figure: Nvidia data collected with Quanergy LiDAR sensors]

Global Positioning of Point Clouds: Object Detection (Occupancy Grid Detection and Tracking)
[Figure: Nvidia data collected with Quanergy LiDAR sensors]

Global Positioning of Point Clouds: Path Planning
[Figure: Nvidia data collected with Quanergy LiDAR sensors]

LiDAR-Video Fusion & Deep Learning: CES 2016 Nvidia Booth
Robotics Trends, "All You Need to Know About Self-Driving Cars from CES," by Brad Templeton, Autonomous Vehicle Expert, January 12, 2016:
"An Nvidia demo in pedestrian detection combined a Quanergy LIDAR and Nvidia cameras. In the demo, they had water jets able to simulate rain, in which case it was the vision that failed and the LIDAR which kept detecting the pedestrians. Quanergy's LIDAR looks at the returns from objects and is able to tell returns from raindrops (which are more dispersed) from returns off of solid objects."

3D Composite Point Cloud - Color Coding: Height

3D Composite Point Cloud - Color Coding: Height

3D Composite Point Cloud - Color Coding: Reflectivity (Intensity vs. Distance)

Global Positioning of Point Clouds - Overlay of LiDAR Point Cloud on Satellite Imagery

Point Cloud Library (PCL) Steps: 3D Object Perception Pipeline
- Clustering
- Object Classification
A minimal clustering sketch follows.
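
PCL provides Euclidean cluster extraction for the clustering step; the sketch below implements the same idea in Python with a KD-tree. The tolerance and minimum cluster size are arbitrary illustrative values:

```python
# Euclidean clustering sketch (the idea behind PCL's EuclideanClusterExtraction).
import numpy as np
from scipy.spatial import cKDTree

def euclidean_clusters(points, tol=1.0, min_size=5):
    """Group points reachable through neighbors closer than `tol` meters."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        frontier, cluster = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            for nb in tree.query_ball_point(points[idx], tol):
                if nb in unvisited:
                    unvisited.remove(nb)
                    frontier.append(nb)
                    cluster.append(nb)
        if len(cluster) >= min_size:
            clusters.append(np.array(cluster))
    return clusters

# Two synthetic blobs 10 m apart should come out as two clusters.
np.random.seed(0)
pts = np.vstack([np.random.randn(50, 3), np.random.randn(50, 3) + 10.0])
print(len(euclidean_clusters(pts)))  # -> 2
```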

Sensing and Perception Context
[Diagram: camera sensor data, LiDAR data, IMU/GPS data, vehicle data, a car motion model, a navigation map, and 3D local/global maps feed the Quanergy Perception Software, which outputs tracked objects]

Quanergy 3D Perception Software
Pipeline: Data Formatting → Data Filtering → Ground Plane Removal → Object Detection → Object Clustering → Classification → Tracking
Outputs (5-30 Hz):
- PCL point cloud
- PCL of clustered objects
- Object list with boundaries
- Object tracks
Formats: PCL/ROS (today), ADTF (roadmap)
A ground-plane removal sketch follows.
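
One common way to implement the Ground Plane Removal stage is a RANSAC plane fit that discards points near the fitted plane; the sketch below shows that approach with illustrative thresholds, not Quanergy's actual parameters:

```python
# RANSAC ground-plane removal sketch (illustrative only).
import numpy as np

def remove_ground(points, dist_thresh=0.2, iters=200, rng=np.random.default_rng(0)):
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - p1) @ normal)  # point-to-plane distances
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[~best_inliers]  # non-ground points go on to clustering

# Synthetic scene: a flat ground patch plus a box-shaped obstacle above it.
rng = np.random.default_rng(1)
ground = np.c_[rng.uniform(-10, 10, (500, 2)), rng.normal(0, 0.02, 500)]
box = rng.uniform([2, 2, 0.5], [3, 3, 1.5], (100, 3))
print(len(remove_ground(np.vstack([ground, box]))))  # -> ~100 (the box)
```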

LiDAR Software / Global 3D Map
[Diagram: Quanergy system platform - the Quanergy LiDAR and other sensors feed the LiDAR software processor, which exchanges map feedback and map assist data with the map cloud]

Some Partners

Q&A
Quanergy Sites: Silicon Valley (HQ), Detroit, Ottawa, Stuttgart, Dubai, Shanghai, Tokyo
Louay Eldada, +1-408-245-9500, louay.eldada@quanergy.com, www.quanergy.com

Quanergy Systems, Inc. Proprietary Rights Statement: This document contains proprietary information belonging to Quanergy Systems, Inc. This document is protected by copyright and trademark laws and is not to be used, in whole or in part, in any way without the express written consent of Quanergy Systems, Inc.