Optical Sensors: Key Technology for the Autonomous Car

Rajeev Thakur, P.E., Product Marketing Manager, Infrared Business Unit, Osram Opto Semiconductors

Autonomously driven cars will combine a variety of technologies in order to keep an eye on their surroundings and their driver at all times and respond appropriately. While fully autonomous driving is still limited to prototypes today, the first partly autonomous assistance systems are already in use on our roads. Optical sensors based on infrared lasers and LEDs are a key technology for today's -- and tomorrow's -- intelligent systems, which will gradually ease the burden on the driver.

Boredom in slow-moving traffic or on the highway will soon be a thing of the past. Instead, the car will take us to our destination on its own while we relax and devote ourselves to other things. For today's road users this is still wishful thinking, but numerous prototypes and headline-grabbing test drives prove that car manufacturers, suppliers, and software companies are all working on concepts for autonomous driving. For the algorithms that direct the car to make correct and safe driving decisions, they must know the driving situation and the car's surroundings in detail. This data is collected by sensors, usually radars, lasers, or cameras. Many of these are already used today in individual driver assistance systems such as the stop-and-go assistant, parking assistant, lane-keeping assistant, or emergency brake assistant. These systems are gradually gaining autonomy: they will first enable partly automated driving, for example on highways, and then lead to fully automated driving. On the sensor side, the challenge is to combine the various technologies so that their data covers as many functions as possible and provides redundant sources, as safety considerations require.
Environment detection using lasers

Laser sensors consist of lasers and detectors, and they measure distances using the propagation time of light (LIDAR, or Light Detection and Ranging, Figure 2). The sensor transmits a light pulse, which is reflected back onto the detector by the object toward which the beam was directed. The distance between the object and the sensor is obtained from the time the light pulse needs to reach the object and return. The range depends on the laser power, the visibility conditions, and the reflectivity of the object. One of the first automotive applications of LIDAR was intelligent cruise control, which measures the distance to the car ahead and adjusts the car's own speed accordingly.

There are various types of LIDAR sensors. In the first type, a laser transmits short light pulses that illuminate the entire scene, and a simple detector array, usually a line, registers the spatially resolved signals. The system measures the distance to objects in the surrounding area and then uses the time sequences to determine how the objects move. Such laser systems typically cover an angular range of ±4° (vertical) and ±20° (horizontal).

Laser scanners, on the other hand, cover a very wide field of view. They steer a focused beam with a rotating mirror and use it to sweep the scene. The best-known example is the 360° laser scanner on the roof of the Google car. For design reasons, scanners are usually integrated into the car body. An example is the Audi demonstrator vehicle that drove from San Francisco to Las Vegas for the 2015 Consumer Electronics Show and travelled autonomously on all the highways. In addition to radar sensors and video cameras, it was equipped with one laser scanner each in the radiator grille and the rear. Laser scanners generally supply a point cloud of the surrounding area, in which every pixel is associated with its actual distance from the car. With their high angular resolution of less than one degree, laser scanners make it possible to identify objects and can distinguish, for example, between a rubbish bin and a pedestrian at the edge of the road. Laser scanners also register obstacles directly in front of the car - a pedestrian's legs, for example - which makes them a suitable supplement to radar sensors.
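The time-of-flight principle described above reduces to a single formula: the range is half the round-trip time multiplied by the speed of light. A minimal sketch in Python (the function name and the example timing value are illustrative, not from the article):

```python
# Sketch of the LIDAR time-of-flight calculation described above.
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to the reflecting object: half the round trip at light speed."""
    return C * round_trip_s / 2.0

# A pulse that returns after about 667 ns corresponds to a target ~100 m away.
print(lidar_range_m(667e-9))  # ≈ 99.98 m
```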
Lasers for LIDAR within the car

LIDAR systems use infrared lasers with short switching times and high output power. A common wavelength is 905 nanometers (nm); this spectral range is barely visible to humans, while detectors are still sensitive to it. The typical optical pulse power is about 25 watts (W). More than ten years ago, Osram became the first manufacturer whose pulse lasers were used to implement LIDAR sensors in the car. To increase the power per laser diode, Osram developed its Nanostack technology, in which three laser diodes are epitaxially stacked in one chip. The pulse laser diode SPL PL90_3 and the Smart Laser SPL LL90_3, for example, supply an optical power of more than 75 W. The SPL LL90_3 (Figure 3) also has integrated driver electronics, which generate a current pulse of about 50 A, enabling laser pulses with steep edges and lengths of about 20 ns. Avalanche photodiodes (APDs), or the less expensive PIN photodiodes with fast switching times of a few nanoseconds (ns), are used as detectors. Suitable surface-mount (SMT) PIN photodiodes with the necessary high sensitivity include the BPW 34S and the SFH 2400. Thanks to their short light pulses, automotive LIDAR systems fall into laser Class 1 and pose no danger to the human eye.

The next development step will be the transition to SMT packages. Laser diodes are commonly supplied today in through-hole packages, because the edge-emitting laser chips require novel SMT concepts, among other things. A series of further developments is conceivable, depending on the application requirements. Longer wavelengths of about 1050 nm, for example, could make it possible to increase the optical power while still meeting eye-safety standards; this spectral range would, however, require new detector technology. LIDAR systems could also benefit from integrated subsystems, such as a combination of laser and driver electronics, or of detector and ASIC, as these enable shorter switching times and a higher temporal measuring resolution.

Camera systems with infrared auxiliary lighting

Cameras form the basis of many parking assistants and lane assistants today. Intelligent processing of camera images and videos makes it possible to capture the area surrounding the car in detail, right down to recognizing traffic signs. The more autonomous driving becomes, the more reliable the interpretation of the camera images must be for decision-making, and the best possible image quality is the prerequisite for this. Additional illumination of the scene with infrared light is therefore useful at dusk or at night. Powerful infrared LEDs (IREDs) with an 850 nm wavelength are suitable light sources: this spectral range is readily detectable for camera sensors but hardly perceivable for humans. One of the first lighting applications for IREDs in the car was the camera-based night-vision assistant, which illuminates about 150 meters of road with infrared light and generates a grey-scale image that is shown on a display.
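The nanosecond-scale switching times matter because, with the time-of-flight relation, every nanosecond of timing uncertainty translates into roughly 15 cm of range uncertainty. A short sketch using the pulse figures quoted above (75 W peak power, pulses of about 20 ns) also shows why the per-pulse energy stays tiny, which is what keeps such systems in laser Class 1; the helper names are illustrative:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_error_m(timing_error_s: float) -> float:
    """Range uncertainty caused by a given timing uncertainty (d = c*t/2)."""
    return C * timing_error_s / 2.0

print(range_error_m(1e-9))  # ≈ 0.15 m per nanosecond of timing error

# Per-pulse energy at the figures quoted in the text: 75 W peak for ~20 ns.
peak_power_w, pulse_width_s = 75.0, 20e-9
pulse_energy_j = peak_power_w * pulse_width_s
print(pulse_energy_j)  # ≈ 1.5e-06 J, i.e. only about 1.5 microjoules per pulse
```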
To achieve the required brightness, the IREDs used must deliver very high optical power and be suitable for continuous operation at high currents. Osram was one of the first suppliers of high-performance IREDs for use in cars and has continued to develop its chip and package technology ever since. Today, the Oslon Black SFH 4715A delivers about 800 mW of optical power at 1 A (Figure 4). With a typical efficiency of 48% at 1 A, it is currently the most efficient component on the market. Its low typical thermal resistance of 6.5 kelvin per watt (K/W) permits continuous operation at up to 1 A. To achieve even brighter single emitters, Osram transferred the Nanostack technology to IREDs, implementing a dual emitter. This broke the 1-watt mark, and the latest chip and package generation, the SFH 4715AS, now reaches 1370 mW of optical power in continuous operation at 1 A. In pulsed mode, currents of up to 3 A are permitted, and the thermal resistance was lowered to 5.5 K/W.

The requirements for the beam angle of the IREDs differ depending on where the cameras are used in the car, which is why there are two variants of the Oslon Black. The 90° lens works particularly well with external optics that shape the beam for the respective application. The 150° version is used to illuminate a large area in the vicinity and, combined with reflector-based optics, is also well suited to producing tight light cones and long ranges.

Cameras in the car interior

The use of cameras will also increase in the car interior. An automatically driven car must know what the driver is occupied with so that it can direct his attention back to the traffic in time before handing control back to him in certain situations. Today, so-called alertness assistants mainly monitor whether the driver is becoming tired, for example by analysing his steering and pedal movements and by using the front camera to detect typical drifting between lanes. If a camera watches the driver's face, the blink frequency can be evaluated to recognise the onset of fatigue. Likewise, determining the viewing direction allows conclusions about the driver's current attentiveness - for example, whether he is looking ahead at the road or is distracted at a moment of danger.

The vehicle interior is not bright enough to obtain high image quality at all times of day from the available light alone, so additional infrared illumination is required. So as not to distract the driver, longer-wavelength emitters at 940 nm are used today instead of 850 nm IREDs, whose light is still perceived in the dark as a weak red glow. The requirements for optical power in continuous operation are high, as a relatively large area has to be illuminated with the smallest possible lighting units. A suitable 940 nm IRED qualified for automotive applications is the Oslon Black SFH 4725S / SFH 4726S, with a beam angle of 90° or 150°. It typically delivers 990 mW of optical power at 1 A.
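The thermal figures above allow a quick sanity check: at 48% efficiency, 800 mW of optical output implies roughly 0.87 W dissipated as heat, which the 6.5 K/W thermal resistance converts into a junction temperature rise of only a few kelvin. A sketch of this arithmetic (the function is illustrative, and efficiency is assumed here to mean optical output divided by electrical input):

```python
def junction_temp_rise_k(p_optical_w: float, efficiency: float,
                         r_th_k_per_w: float) -> float:
    """Junction temperature rise: heat not emitted as light, times R_th."""
    p_electrical_w = p_optical_w / efficiency   # electrical input power
    p_heat_w = p_electrical_w - p_optical_w     # power dissipated as heat
    return p_heat_w * r_th_k_per_w

# Oslon Black SFH 4715A figures from the text: 800 mW optical, 48%, 6.5 K/W.
print(round(junction_temp_rise_k(0.8, 0.48, 6.5), 1))  # ≈ 5.6 K
```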
Conclusions

The self-driving car of the future must be very well informed about its surroundings and its driver in order to make correct driving decisions safely, and optical technologies play an important role in capturing this information. Two trends can be observed in the development of new sensors: a strong push toward measuring additional details, and increasing sensor fusion - the merging of systems that previously worked largely on their own - in order to implement additional functions. This results in new requirements for the field of view, the wavelengths, and the required optical power of the light sources. The continuous development of light sources for optical sensors will therefore play an important role on the road to automated, self-driving vehicles. Previously separate assistance systems will be merged into larger complete systems, changing the requirements that the individual sensors have to meet.

Illustrations:

Figure 1: All-round visibility for autonomous driving: In addition to radar, measurements from cameras and, in particular, laser sensors will make autonomous driving possible in the future.

Figure 2: Laser radar (LIDAR): A transmitted light pulse (green) is reflected by an object - in this case the car in front - back onto the detector. The distance between the object and the car is calculated from the propagation time of the light pulse.

Figure 3: LIDAR systems have been used in cars for more than ten years: The pulse laser diode SPL LL90_3 is equipped with an integrated driver to generate short pulses at high currents.

Figure 4: Infrared light for good camera images: The Oslon Black is currently one of the most powerful infrared LEDs with an 850 nm wavelength. It delivers 800 mW of optical power (SFH 4715A) at 1 A of current, or as much as 1370 mW in the stack version (SFH 4715AS).
