Sensor technology for mobile robots


1 Laser application, vision application, sonar application and sensor fusion

2 Outline: Introduction (mobile robot perception, definitions, sensor classification); Sensor performance; A closer look at ultrasonic sensors, laser rangefinders, and vision technology; Sensor fusion

3 Outline: Introduction (mobile robot perception, definitions, sensor classification); Sensor performance; A closer look at ultrasonic sensors, laser rangefinders, and vision technology; Sensor fusion

4 Introduction Mobile robot perception. A robot without perception is not a robot. To make reasonable assumptions as a basis for its decisions, a robot must be able to observe its environment. This is done using a number of sensors.

5 Introduction Sensor classification Lots of different sensors exist...

6 Introduction Sensor classification...to measure lots of values: temperature, sound amplitude, battery voltage, distance, light intensity, wheel load, position, air pressure, motor speed, orientation, physical contact, wheel position, speed,...

7 Introduction Sensor classification: proprioceptive vs. exteroceptive, active vs. passive

8 Introduction Sensor classification. Active proprioceptive: wheel/motor sensors (optical encoders, magnetic encoders, inductive encoders). Active exteroceptive: ultrasonic sensors, laser rangefinders, Doppler radar, GPS. Passive proprioceptive: gyroscopes, potentiometers, brush encoders. Passive exteroceptive: compass, CCD/CMOS cameras.

9 Sensor performance Basic sensor performance: dynamic range, resolution, frequency. In situ sensor performance: sensitivity / cross-sensitivity, systematic error, random error, precision / accuracy

10 Sensor performance Basic sensor performance. Dynamic range: ratio between the lower and upper limits of input values to the sensor. Resolution. Frequency. In situ sensor performance: sensitivity / cross-sensitivity, systematic error, random error, precision / accuracy

11 Sensor performance Basic sensor performance. Dynamic range: ratio between the lower and upper limits of input values to the sensor. Resolution: minimum difference between two values that can be detected by a sensor. Frequency. In situ sensor performance: sensitivity / cross-sensitivity, systematic error, random error, precision / accuracy

12 Sensor performance Basic sensor performance. Dynamic range: ratio between the lower and upper limits of input values to the sensor. Resolution: minimum difference between two values that can be detected by a sensor. Frequency: speed with which a sensor can provide a stream of readings. In situ sensor performance: sensitivity / cross-sensitivity, systematic error, random error, precision / accuracy
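A hedged numerical illustration of dynamic range (not from the slides; the sensor limits below are made up): the ratio is commonly quoted in decibels, using 10·log10 of the ratio for power quantities and 20·log10 for field quantities such as voltage.

```python
import math

# Illustrative sketch: dynamic range expressed in decibels.
# Convention: 10*log10(upper/lower) for power quantities,
#             20*log10(upper/lower) for field quantities such as voltage.
def dynamic_range_db(lower, upper, power_quantity=True):
    factor = 10.0 if power_quantity else 20.0
    return factor * math.log10(upper / lower)

# Hypothetical examples:
print(dynamic_range_db(0.12, 5.0))           # a range sensor valid from 12 cm to 5 m -> ~16.2 dB
print(dynamic_range_db(0.001, 20.0, False))  # a 1 mV .. 20 V voltage signal -> ~86 dB
```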

13 Outline: Introduction (mobile robot perception, definitions, sensor classification); Sensor performance; A closer look at ultrasonic sensors, laser rangefinders, and vision technology; Sensor fusion

14 Ultrasonic sensor d = c · t, where d = distance travelled (round trip), c = speed of wave propagation, t = time of flight
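A minimal sketch of this time-of-flight relation, assuming sound in air at roughly 343 m/s and halving the round-trip distance to obtain the range (the constant and the example value are illustrative, not from the slides).

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

def ultrasonic_range(time_of_flight_s):
    """One-way distance to the object: the pulse travels out and back,
    so the round-trip distance c * t is halved."""
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

print(ultrasonic_range(0.0175))  # a 17.5 ms round trip corresponds to about 3 m
```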

15 Ultrasonic sensor Advantages Ultrasonic sensors are: very cheap, independent of external signals, usable without complex preprocessing, and used in almost every mobile robot

16 Ultrasonic sensor Disadvantages / Problems Sensor measures regions of constant depth instead of points. Low frequency: mean distance to object = 3 m, wave speed ~343 m/s, so ~20 ms per value. Imagine 20 sensors around the robot, measuring sequentially to minimize interference... frequency = 2.5 Hz
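A back-of-envelope check of the update rate quoted above, under the same assumptions (speed of sound ~343 m/s, 3 m mean range, 20 sensors fired one after another); the numbers are illustrative.

```python
c = 343.0              # m/s, speed of sound in air (assumed)
mean_distance = 3.0    # m
t_per_reading = 2 * mean_distance / c     # round trip: ~17.5 ms, i.e. roughly 20 ms
n_sensors = 20                            # fired sequentially to minimize crosstalk
print(1.0 / (n_sensors * t_per_reading))  # ~2.9 Hz, on the order of the ~2.5 Hz on the slide
```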

17 Ultrasonic sensor Disadvantages / Problems Threshold for accepting an incoming sound wave as a valid echo. Acoustically reflective materials. Reflection on surfaces angled with respect to the face of the sensor

18 Laser rangefinder 3 ways to measure time-of-flight: pulsed laser, frequency-modulated continuous wave, phase-shift measurement

19 Laser rangefinder Phase-shift measurement Sensor transmits amplitude modulated light Known modulation frequency (e.g. 5 MHz)

20 Laser rangefinder Phase-shift measurement Sensor transmits amplitude-modulated light with a known modulation frequency (e.g. 5 MHz). Measuring the phase shift between transmitted and reflected signal. Computing the distance using c = f · λ, where c = speed of light, f = modulation frequency, λ = wavelength of the modulation
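A hedged sketch of how the measured phase shift maps to range under these definitions: the modulated beam covers twice the distance D, so a phase shift θ corresponds to D = λ·θ/(4π). The 5 MHz modulation frequency is the slide's example; the phase value below is made up.

```python
import math

C = 3.0e8  # m/s, speed of light (approximate)

def range_from_phase(theta_rad, modulation_freq_hz):
    """Distance D from the phase shift theta between transmitted and
    reflected signal: the beam covers 2*D, so theta = 2*pi*(2*D)/lambda,
    hence D = lambda * theta / (4*pi)."""
    wavelength = C / modulation_freq_hz   # lambda = c / f
    return wavelength * theta_rad / (4.0 * math.pi)

# 5 MHz modulation -> lambda = 60 m; a phase shift of pi/2 gives 7.5 m
print(range_from_phase(math.pi / 2, 5e6))
```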

21 Laser rangefinder

22 Laser rangefinder Advantages High frequency. Again: mean distance to object = 3 m, wave speed ~300,000 km/s, so ~20 ns per value, giving a theoretical frequency of ~50 MHz. Sensor measures points instead of regions
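The same back-of-envelope check for the laser rangefinder numbers above, with the speed of light filled in (assumed, illustrative values).

```python
c = 3.0e8            # m/s, speed of light (approximate)
mean_distance = 3.0  # m
t_per_reading = 2 * mean_distance / c   # ~20 ns per round trip
print(1.0 / t_per_reading)              # 5e7 readings/s, i.e. the ~50 MHz theoretical bound
```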

23 Laser rangefinder Disadvantages / Problems Expensive. Reflection of the laser beam on highly polished surfaces. Cannot detect transparent materials (glass)

24 Vision technology Vision is the most powerful sense. It provides a machine with the same information a human uses to interact with the environment. It is a long way from raw data to useful information

25 Vision technology Sensors CCD (charge-coupled device) CMOS (complementary metal oxide semiconductor)

26 Vision technology Sensors CCD CCD (charge-coupled device) CMOS (complementary metal oxide semiconductor)

27 Vision technology Sensors CMOS CCD (charge-coupled device) CMOS (complementary metal oxide semiconductor)

28 Vision technology visual ranging Depth is a very important piece of information in mobile robotics applications. Any vision chip collapses the 3D world into a 2D image. Additional information is needed: stereo vision, motion, changing the camera's geometry

29 Vision technology visual ranging Depth from focus Depth from stereo vision

30 Vision technology visual ranging Depth from focus: thin-lens relation 1/f = 1/d + 1/e (f = focal length, d = object distance, e = image distance) Depth from stereo vision

31 Vision technology visual ranging Depth from focus Depth from stereo vision
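A minimal sketch of depth from stereo vision, assuming a rectified pinhole stereo pair so that depth follows the standard triangulation Z = f·b/d; the focal length, baseline, and disparity values below are illustrative, not from the slides.

```python
def depth_from_stereo(focal_length_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo pair:
    Z = f * b / d, with disparity d and focal length f both in pixels."""
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers: f = 700 px, baseline 12 cm, disparity 20 px -> 4.2 m
print(depth_from_stereo(700.0, 0.12, 20.0))
```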

32 Outline: Introduction (mobile robot perception, definitions, sensor classification); Sensor performance; A closer look at ultrasonic sensors, laser rangefinders, and vision technology; Sensor fusion

33 Sensor fusion Every sensor has different advantages and disadvantages. Using different sensors for different aspects of one task leads to sensor fusion
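One common way to combine readings from complementary sensors is inverse-variance weighting, as in a one-dimensional Kalman update; the slides do not prescribe a particular fusion method, so the sketch below (with made-up sensor values and variances) is purely illustrative.

```python
def fuse(z_a, var_a, z_b, var_b):
    """Fuse two noisy estimates of the same quantity by weighting each
    with the inverse of its variance."""
    w_a = var_b / (var_a + var_b)     # the less noisy sensor gets more weight
    w_b = var_a / (var_a + var_b)
    fused = w_a * z_a + w_b * z_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# e.g. sonar reads 3.20 m (variance 0.04), laser reads 3.05 m (variance 0.0004)
print(fuse(3.20, 0.04, 3.05, 0.0004))  # result is dominated by the laser reading
```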

34 Sensor fusion

35 Sensor fusion

36 Sensor fusion

37 Sensor fusion Thank you for your attention!
