Electronic Travel Aids for Blind Guidance: An Industry Landscape Study

Electronic Travel Aids for Blind Guidance: An Industry Landscape Study
Kun (Linda) Li, EECS, UC Berkeley
Dec 4th, 2015
IND ENG 290 Final Presentation

Visual Impairment
Traditional travel aids are limited, invasive, training-heavy, and not socially friendly.
Globally: 285M of 7.3B people live with visual impairment (VI), 39M of them blind; 90% live in low-income settings, and public facilities in developing countries are incomplete.
US: 10M of 320M with VI, 1.3M blind; about 109K VI people use white canes (1.1%), and just over 7K use guide dogs (0.07%).

Electronic Travel Aids (ETAs): Sensors
Functions: obstacle detection, mapping, and navigation
Signals received: acoustic, electrical, optical, etc.
Signals translated into: auditory cues, tactile cues, stereophonic image

Existing Products: Ultrasonic Sensor (Sonar)
(Primary reference: Vance Landford, "Electronic travel aids (ETAs), past and present," TAER, April 2004. Image courtesy: S. Shoval et al., IEEE Trans. on Syst., 1998.)
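The physics behind all of these sonar aids is simple time-of-flight ranging: emit a ping, time the echo, halve the round trip. A minimal Python sketch; the echo time and speed-of-sound constant below are illustrative assumptions, not values from any product listed here:

```python
# Ultrasonic ranging: distance = speed_of_sound * round_trip_time / 2
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees C (assumed)

def sonar_distance_m(echo_time_s: float) -> float:
    """Distance to the obstacle from the round-trip echo time."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

# Example: a 10.7 ms round trip corresponds to roughly 1.83 m,
# the stated range of the Pathsounder and WalkMate.
print(f"{sonar_distance_m(0.0107):.2f} m")
```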

Type-I: single output for object preview; go/no-go system; secondary aid

Device | Time | Range | Price | Problem
Pathsounder (Russel) | 1966-NA | 1.83 m | NA | -
Ultrasonic Cone (Mowat) | 1972-NA | 4.02 m | NA | fails in bad weather
Polaron (Nurion) | 1980-NA(?) | 4.88 m | $892 | -
Sensory 6 | NA-1994 | 2-3.5 m | NA | head position is important
WalkMate | 1993-NA | 1.83 m | NA | beam may vary outdoors
Miniguide US | 2004-now | 7.92 m | $545 | -

Type-II: multiple outputs for object preview; go/no-go system; secondary aid or primary tool (cane)

Device | Time | Range | Price | Demo
Wheelchair Pathfinder (Nurion) | NA | 2.44 m forward, 1.22 m above head, 0.3 m to the side, 2.44 m drop-offs | $4500 | still available
Laser Cane N-2000 (Nurion) | NA | 3.66 m, 3 beams (straight, head, drop) | $2650 | still available
GuideCane | NA-1998 | 3.5 m | NA | NA
BAT 'K' Cane Handle | NA-2003 | NA | NA | NA
UltraCane | NA | 2 or 4 m forward, 1.6 m above head | $635 | available

Type-III (Dr. Leslie Kay): object preview plus environmental information, giving the text rather than just the headlines!
Type-I: Ultrasonic Torch (1965), the 1st ETA product
Type-II: BAT 'K' Cane Handle
Type-III: Sonic Guide, the concept of the Type-III ETA: interpretation of tonal characteristics makes primitive object identification possible

Type-IV: object preview plus artificial intelligence
Sonic Pathfinder: a computer translates sonic energy into directional music notes
Displays only information of practical interest, not a visual picture of the world
Secondary aid, less training required
Not commercialized; research as of 1996
http://members.optuszoo.com.au/aheyes40/pa/pf_blerb.html

Limitations of current sonar-based products
Currently available products are still secondary aids to a white cane or guide dog
Limited range (~5 m) and resolution (>3 cm)
Slow response, not suitable for fast walking
Acoustic interference and screening
Large beam divergence, not directional
No precise information about the shape or motion of obstacles
Prices: Miniguide US $545 | UltraCane $635 | Laser Cane $2650 | special sonar

State-of-the-art Research
Infrared sensor
CCD or CMOS camera
Stereo camera
Projected-light camera
3D LiDAR

Infrared Sensor
Mechanism: triangulation
In addition to distance, it provides material recognition and shape analysis
Range: 10 cm-1.5 m with 93% accuracy
Response time: 39 ms, compared with 100-200 ms for sonar
(Reference: A.S. Al-Fahoum et al., "A smart microcontroller-based blind guidance," Hindawi, 2013. Image courtesy: http://www.physicscentral.com/explore/action/infraredlight.cfm)
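Since the slide names triangulation as the mechanism, here is a minimal sketch of how a triangulating IR sensor turns the lateral offset of the reflected spot on its detector into a distance. The focal length, baseline, and offset values are illustrative assumptions, not specifications of any particular sensor:

```python
def ir_triangulation_distance_m(focal_length_m: float,
                                baseline_m: float,
                                spot_offset_m: float) -> float:
    """Distance from the lateral offset of the reflected IR spot.

    The emitter and the position-sensitive detector sit a known baseline
    apart; the nearer the target, the larger the offset of the returned spot,
    so distance = focal_length * baseline / offset.
    """
    return focal_length_m * baseline_m / spot_offset_m

# Example with assumed optics: f = 5 mm, baseline = 20 mm, spot offset = 0.2 mm
print(f"{ir_triangulation_distance_m(0.005, 0.02, 0.0002):.2f} m")  # -> 0.50 m
```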

CCD or CMOS Camera: The vOICe
When a webcam meets neuroscience: a whole sound picture, not just go/no-go, to truly improve quality of life
Neuroscience: neural crossmodal plasticity
The vOICe software performs image-to-sound rendering through crossmodal sensory integration
Creates a stereophonic effect, an acoustic panorama
Drawbacks: limited ranging ability
(Demos and papers: https://www.seeingwithsound.com/)
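For intuition, here is a minimal sketch of this kind of image-to-sound rendering: a left-to-right column scan with vertical position mapped to pitch and brightness to loudness. It illustrates the general idea only, not The vOICe's actual implementation, and the frequency range and scan time are assumed values:

```python
import numpy as np

def image_to_sound(gray: np.ndarray, sample_rate: int = 22050,
                   scan_seconds: float = 1.0,
                   f_low: float = 500.0, f_high: float = 5000.0) -> np.ndarray:
    """Render a grayscale image (rows x cols, values in 0-1) as a left-to-right sweep.

    Each column becomes a short time slice; each row contributes a sine tone
    whose frequency rises toward the top of the image and whose amplitude
    follows pixel brightness.
    """
    rows, cols = gray.shape
    samples_per_col = int(sample_rate * scan_seconds / cols)
    t = np.arange(samples_per_col) / sample_rate
    freqs = np.linspace(f_high, f_low, rows)  # top row = highest pitch
    audio = []
    for c in range(cols):
        tones = np.sin(2 * np.pi * freqs[:, None] * t)      # (rows, samples)
        slice_ = (gray[:, c][:, None] * tones).sum(axis=0)   # brightness-weighted mix
        audio.append(slice_)
    audio = np.concatenate(audio)
    return audio / (np.abs(audio).max() + 1e-9)  # normalize to [-1, 1]

# Example: a bright rising diagonal produces a rising sweep.
img = np.eye(32)[::-1]
waveform = image_to_sound(img)
```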

Stereo Camera
Two lenses and sensors simulate human binocular vision
Provides depth information, but with limited range and accuracy; not as good as our eyes!
(Reference: V. Pradeep et al., "Robot vision for the visually impaired," IEEE conference, 2010)
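The depth-from-disparity relation behind any stereo camera is Z = f * B / d. A minimal sketch, with an assumed focal length, baseline, and disparity rather than the parameters of the referenced system:

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float,
                   disparity_px: float) -> float:
    """Depth of a point from its horizontal pixel shift between the two views.

    Z = f * B / d: the smaller the disparity, the farther the point, which is
    why stereo accuracy degrades quickly with range.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_length_px * baseline_m / disparity_px

# Example with assumed parameters: f = 700 px, baseline = 12 cm, disparity = 21 px
print(f"{stereo_depth_m(700, 0.12, 21):.2f} m")  # -> 4.00 m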

Projected-light 3D Camera
2D cameras: stereo or RGB
Combines the projection of a light pattern with a standard 2D camera
Depth information: patterned light plus triangulation
Available products: Ensenso N10, Microsoft Kinect (0.7-3.5 m), Asus Xtion, PrimeSense Carmine (Apple) (0.35-1.4 m)
Drawbacks: limited range, not suitable for outdoor use
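A projected-light camera recovers depth by triangulating the shift of the known pattern against a calibrated reference plane, much like stereo with one camera replaced by a projector. A minimal sketch under assumed calibration values; none of the numbers are Kinect or Carmine specifications:

```python
def structured_light_depth_m(ref_depth_m: float, focal_length_px: float,
                             baseline_m: float, shift_px: float) -> float:
    """Depth from the shift of a projected dot relative to a reference plane.

    With the pattern calibrated against a plane at ref_depth_m, a dot observed
    shifted by shift_px pixels lies at depth 1 / (1/Z0 + shift / (f * B)).
    """
    return 1.0 / (1.0 / ref_depth_m + shift_px / (focal_length_px * baseline_m))

# Example with assumed calibration: reference plane at 2 m, f = 580 px,
# projector-camera baseline = 7.5 cm, observed shift = 10 px -> ~1.37 m
print(f"{structured_light_depth_m(2.0, 580, 0.075, 10):.2f} m")
```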

3D LiDAR Camera
Need a compromise: a LiDAR (laser radar, i.e. time-of-flight (ToF) camera) at a cheaper price!

Product | Company | Approach | Range | Resolution | FoV | Price
Swiss Ranger 4000 | Heptagon | Modulated | 5-8 m | 176x144 pixels | 43.6° x 34.6° | $9K
CamCube 2.0 | PMD Tech. | - | 7 m | 204x204 pixels | 40° x 40° | $12K
Puck | Velodyne | Pulsed, scanner | 100 m | - | 360° x 30° | $8K
TigerCub | ASC3D | Pulsed, flash | ~1 km | - | - | $50K
LiDAR-Lite 2 | PulsedLight | Point-wise | 40 m | 1 cm | - | $115
Something in between? | - | - | 10 m | 1x1 ppi | 40° x 40° | ?
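Pulsed LiDARs such as the scanned and flash devices above measure distance from the round-trip time of a laser pulse, d = c * t / 2. A minimal sketch; the example pulse timing is an illustrative assumption:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance from the round-trip time of a laser pulse: d = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# Example: a 667 ns round trip corresponds to roughly 100 m,
# the stated range of the Velodyne Puck.
print(f"{lidar_distance_m(667e-9):.1f} m")
```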

Projection
The global LiDAR market is expected to reach $624.9M by 2020
285M vision-impaired people, and it will make their lives a lot better!
Autonomous cars and robotics markets to lead
Moore's law for LiDAR?

Thank you! Email: lindakli@berkeley.edu
(Product images: SR4000, CamCube 2.0, Puck, TigerCub, LIDAR-Lite 2)