Sensor Modalities. Sensor modality: sensors which measure the same form of energy and process it in similar ways. Modality refers to the raw input used by the sensors. Different modalities: sound, pressure, temperature, light (visible light, infrared light, X-rays), etc.

4.1.1 Classification of Sensors. What: Proprioceptive sensors measure values internal to the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status. Exteroceptive sensors acquire information from the robot's environment, e.g. distances to objects, intensity of the ambient light, unique features. How: Passive sensors measure energy coming from the environment. Active sensors emit their own energy and measure the reaction; this often gives better performance, but with some influence on the environment.

General Classification (1) 4.1.1

General Classification (2) 4.1.1

Characterizing Sensor Performance 4.1.2. Basic sensor response ratings: Range: lower and upper limits. Resolution: minimum difference between two values that can be detected. Linearity: variation of the output signal as a function of the input signal. Bandwidth or frequency: the speed with which a sensor can provide a stream of readings; usually there is an upper limit depending on the sensor and the sampling rate; a lower limit is also possible, e.g. for an acceleration sensor; one also has to consider signal delay.

In Situ Sensor Performance (1) 4.1.2. Characteristics that are especially relevant for real-world environments: Sensitivity: ratio of output change to input change; however, in real-world environments the sensor very often also has high sensitivity to other environmental changes, e.g. illumination. Cross-sensitivity: sensitivity to environmental parameters that are orthogonal to the target parameters, e.g. the influence of other active sensors. Error / Accuracy: difference between the sensor's output and the true value, error = m - v, where m = measured value and v = true value.

4.1.2 In Situ Sensor Performance (2). Characteristics that are especially relevant for real-world environments: Systematic errors -> deterministic errors, caused by factors that can (in theory) be modeled and therefore predicted, e.g. calibration of a laser sensor or of the distortion caused by the optics of a camera. Random errors -> non-deterministic; no prediction is possible; however, they can be described probabilistically, e.g. hue instability of a camera, black-level noise of a camera. Precision: reproducibility of sensor results.
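As a rough illustration of the distinction, a minimal Python sketch of how the two error types might be separated empirically from repeated readings of a target at a known distance; the decomposition into mean bias and standard deviation is standard practice, and all numbers are illustrative, not from the slides:

```python
import numpy as np

# Repeated readings of a target at a known true distance (illustrative values).
true_value = 2.000
readings = np.array([2.031, 2.029, 2.035, 2.027, 2.033, 2.030])

systematic_error = readings.mean() - true_value   # bias: could in principle be calibrated away
random_error_std = readings.std(ddof=1)           # spread: the non-deterministic part
precision_ok = random_error_std < 0.005           # reproducibility check against a chosen tolerance

print(systematic_error, random_error_std, precision_ok)
```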

4.1.2 Characterizing Error: The Challenges in Mobile Robotics. A mobile robot has to perceive, analyze and interpret the state of its surroundings. Measurements in real-world environments are dynamically changing and error prone. Examples: changing illumination, specular reflections, light- or sound-absorbing surfaces, cross-sensitivity of robot sensors to robot pose and robot-environment dynamics. These effects are rarely possible to model and therefore appear as random errors. Systematic errors and random errors might be well defined in a controlled environment; this is not the case for mobile robots!

4.1.2 Multi-Modal Error Distributions: The Challenges in Mobile Robotics. The behavior of sensors is modeled by a probability distribution (random errors); usually there is very little knowledge about the causes of random errors. Often the probability distribution is assumed to be symmetric or even Gaussian; however, it is important to realize how wrong this can be! Examples: A sonar (ultrasonic) sensor might overestimate the distance in a real environment, so its error distribution is not symmetric. Thus the sonar sensor might be best modeled by two modes: - a mode for the case that the signal returns directly - a mode for the case that the signal returns after multi-path reflections. A stereo vision system might correlate the two images incorrectly, thus producing results that make no sense at all.
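To make the two-mode idea concrete, a minimal Python sketch modeling the sonar reading as a mixture of two Gaussians: one mode for direct returns and one, shifted toward longer ranges, for multi-path returns. The parameter values (sigma_direct, multipath_offset, mixture weight) are illustrative assumptions, not values from the text:

```python
import numpy as np

def sonar_measurement_pdf(z, true_range,
                          sigma_direct=0.02,      # std dev of direct returns [m]
                          multipath_offset=0.5,   # mean extra distance from multi-path [m]
                          sigma_multipath=0.3,    # std dev of multi-path returns [m]
                          w_direct=0.8):          # fraction of direct returns
    """Likelihood of a sonar reading z given the true range, as a two-mode mixture:
    one mode for the direct return, one for overestimates caused by multi-path."""
    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    p_direct = gauss(z, true_range, sigma_direct)
    p_multi = gauss(z, true_range + multipath_offset, sigma_multipath)
    return w_direct * p_direct + (1.0 - w_direct) * p_multi

# Overestimates are far more plausible than equally large underestimates:
print(sonar_measurement_pdf(2.55, true_range=2.0))   # multi-path overestimate
print(sonar_measurement_pdf(1.45, true_range=2.0))   # same-sized underestimate, much less likely
```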

Proximity Sensors. Measure the relative distance (range) between the sensor and objects in the environment. Most proximity sensors are active. Common types: sonar (ultrasonics), infrared (IR), bump and feeler sensors.

4.1.6 Range Sensors (time of flight) (1). Sensors for large-range distance measurement are called range sensors. Range information is a key element for localization and environment modeling. Ultrasonic sensors as well as laser range sensors make use of the propagation speed of sound or electromagnetic waves, respectively. The traveled distance of a sound or electromagnetic wave is given by d = c . t, where d = distance traveled (usually round-trip), c = speed of wave propagation, t = time of flight.
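A quick illustration of d = c . t in Python; the constants and the function name are chosen here for illustration, and the one-way range is taken as half the traveled distance because the wave goes out and back:

```python
# Minimal sketch: converting a measured round-trip time of flight into a range.
SPEED_OF_SOUND = 343.0   # m/s, about 0.3 m/ms as quoted in the text
SPEED_OF_LIGHT = 3.0e8   # m/s, about 0.3 m/ns

def range_from_time_of_flight(t_round_trip, c):
    """d = c * t gives the total traveled distance; the one-way range is half of it."""
    return 0.5 * c * t_round_trip

print(range_from_time_of_flight(10e-3, SPEED_OF_SOUND))  # ~1.7 m for a 10 ms ultrasonic echo
print(range_from_time_of_flight(10e-9, SPEED_OF_LIGHT))  # ~1.5 m for a 10 ns laser echo
```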

Range Sensors (time of flight) (2) 4.1.6. It is important to point out that: the propagation speed of sound is about 0.3 m/ms, whereas the propagation speed of electromagnetic signals is about 0.3 m/ns, i.e. one million times faster; a traveled distance of 3 meters therefore corresponds to 10 ms for an ultrasonic system but only 10 ns for a laser range sensor; measuring the time of flight of electromagnetic signals is thus not an easy task, which makes laser range sensors expensive and delicate. The quality of time-of-flight range sensors mainly depends on: uncertainties about the exact time of arrival of the reflected signal; inaccuracies in the time-of-flight measurement (laser range sensors); the opening angle of the transmitted beam (ultrasonic range sensors); interaction with the target (surface, specular reflections); variation of the propagation speed; the speed of the mobile robot and of the target (if not at standstill).

4.1.6 Laser Range Sensor (time of flight, electromagnetic). (Figure: phase-measurement setup showing transmitter, target, and the transmitted and reflected beams.) Transmitted and received beams are coaxial. The transmitter illuminates a target with a collimated beam. The receiver detects the time needed for the round trip. A mechanical mechanism with a mirror sweeps the beam to obtain 2D or 3D measurements.
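The phase-measurement principle itself is not spelled out in this transcription; as a hedged sketch of the standard form, the beam is amplitude-modulated at frequency f and the range is recovered from the phase shift theta between transmitted and reflected signals, D = lambda * theta / (4 * pi), unambiguous only up to half the modulation wavelength. The function name and example numbers are illustrative:

```python
import math

def phase_shift_range(theta_rad, modulation_freq_hz, c=3.0e8):
    """Range from the measured phase shift of the modulated beam:
    D = lambda * theta / (4 * pi), ambiguous beyond lambda / 2."""
    wavelength = c / modulation_freq_hz
    return wavelength * theta_rad / (4.0 * math.pi)

# Example: 5 MHz modulation (lambda = 60 m) and a 90-degree phase shift -> 7.5 m.
print(phase_shift_range(math.pi / 2, 5e6))
```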

4.1.6 Laser Range Sensor (time of flight, electromagnetic). The quality of the range estimate depends strongly on the received signal amplitude: the measurement uncertainty is inversely proportional to the square of the amplitude. Hence dark, distant objects will not produce as good range estimates as closer, brighter objects.

Laser Range Sensor (time of flight, electromagnetic) 4.1.6. Typical range image of a 2D laser range sensor with a rotating mirror. The lengths of the lines through the measurement points indicate the uncertainties.

4.2 Uncertainty Representation. Sensing is always related to uncertainties. What are the sources of uncertainty? How can uncertainty be represented or quantified? How does it propagate, i.e. what is the uncertainty of a function of uncertain values? How do uncertainties combine if different sensor readings are fused? What is the merit of all this for mobile robotics? Some definitions: Sensitivity: G = out / in. Resolution: smallest change which can be detected. Dynamic range: value_max / resolution (typically 10^4 to 10^6). Accuracy: error_max = (measured value) - (true value). Errors are usually unknown: deterministic or non-deterministic (random).
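A minimal sketch of these definitions in Python, with illustrative values for a hypothetical range sensor (none of the numbers come from the text):

```python
def sensitivity(delta_out, delta_in):
    return delta_out / delta_in            # G = out / in

def dynamic_range(value_max, resolution):
    return value_max / resolution          # typically 10^4 .. 10^6

def error(measured, true):
    return measured - true                 # accuracy is bounded by the maximum |error|

print(sensitivity(0.5, 0.1))               # 5.0 output units per input unit
print(dynamic_range(30.0, 0.001))          # 3.0e4 for a 30 m sensor with 1 mm resolution
print(error(2.07, 2.00))                   # 0.07 m measurement error
```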

Uncertainty Representation 4.2 Statistical representation and independence of random variables

Gaussian Distribution 4.2.1. (Figure: bell-shaped probability density of a zero-mean Gaussian; peak value about 0.4, x-axis from -2 to 2.)
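The formula on this slide is not preserved in the transcription; for reference, the Gaussian probability density in its standard textbook form (not taken from the slide) is:

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
```

With mu = 0 and sigma = 1 the peak value is 1/sqrt(2*pi), approximately 0.4, which matches the axis labels visible in the original figure.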

4.2.2 The Error Propagation Law: Motivation. Imagine extracting a line based on point measurements with uncertainties. The model parameters r (length of the perpendicular from the origin to the line) and α (its angle to the abscissa) describe the line uniquely; each measurement point is given in polar coordinates as x_i = (ρ_i, θ_i). The question: what is the uncertainty of the extracted line (r, α), knowing the uncertainties of the measurement points that contribute to it?
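The error propagation law itself is not reproduced in this transcription; a minimal sketch of the standard first-order (Jacobian-based) form follows, applied to an illustrative polar-to-Cartesian conversion with assumed sensor noise values:

```python
import numpy as np

def propagate_covariance(jacobian, cov_in):
    """First-order error propagation: if Y = f(X) with input covariance C_X,
    then C_Y ≈ F_X C_X F_X^T, where F_X is the Jacobian of f at the input."""
    F = np.asarray(jacobian)
    return F @ np.asarray(cov_in) @ F.T

# Illustrative example: converting a polar measurement (rho, theta) to Cartesian (x, y).
rho, theta = 2.0, np.deg2rad(30.0)
cov_polar = np.diag([0.01**2, np.deg2rad(0.5)**2])        # assumed sensor noise
F = np.array([[np.cos(theta), -rho * np.sin(theta)],       # d(x,y)/d(rho,theta)
              [np.sin(theta),  rho * np.cos(theta)]])
print(propagate_covariance(F, cov_polar))                   # covariance of (x, y)
```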

Feature Extraction - Scene Interpretation 4.3. Processing pipeline: environment sensing -> signal treatment -> feature extraction -> scene interpretation. A mobile robot must be able to determine its relationship to the environment by sensing and interpreting the measured signals. A wide variety of sensing technologies are available, as we have seen in the previous section. However, the main difficulty lies in interpreting these data, that is, in deciding what the sensor signals tell us about the environment. This drives the choice of sensors (e.g. indoor, outdoor, walls, free space ...) and the choice of the environment model.

4.3 Features. Features are distinctive elements or geometric primitives of the environment. They usually can be extracted from measurements and mathematically described: low-level features (geometric primitives) like lines and circles; high-level features like edges, doors, tables or trash cans. In mobile robotics, features help with localization and map building.

Environment Representation and Modeling Features 4.3. Environment representation: continuous metric (x, y, θ); discrete metric (metric grid); discrete topological (topological grid). Environment modeling: Raw sensor data, e.g. laser range data or grayscale images: large volume of data, low distinctiveness; makes use of all acquired information. Low-level features, e.g. lines and other geometric features: medium volume of data, average distinctiveness; filters out the useful information, but ambiguities remain. High-level features, e.g. doors, a car, the Eiffel Tower: low volume of data, high distinctiveness; filters out the useful information with few or no ambiguities, but possibly not enough information.

4.3 Environment Models: Examples. A: Feature-based model. B: Occupancy grid.

Feature extraction based on range images 4.3.1. Geometric primitives like line segments, circles, corners, edges. For most other geometric primitives, the parametric description of the features becomes too complex, and no closed-form solutions exist. However, line segments are very often sufficient to model the environment, especially for indoor applications.

Features Based on Range Data: Line Extraction (1) 4.3.1 Least Squares Weighted Least Squares

4.3.1 Features Based on Range Data: Line Extraction (2). Example with 17 measurements; the measurement error (σ) is proportional to ρ^2, hence weighted least squares is used.
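The least-squares and weighted least-squares fits themselves are not reproduced in this transcription; the following Python sketch shows one standard closed-form solution for fitting a line in (r, α) form to polar range points, with weights w_i = 1/σ_i^2 and σ_i proportional to ρ_i^2 as suggested above. The function name and the weighting choice are illustrative assumptions:

```python
import numpy as np

def fit_line_weighted(rho, theta, sigma=None):
    """Weighted least-squares fit of a line x*cos(alpha) + y*sin(alpha) = r
    to range points given in polar coordinates (rho_i, theta_i)."""
    rho, theta = np.asarray(rho, float), np.asarray(theta, float)
    sigma = rho ** 2 if sigma is None else np.asarray(sigma, float)
    w = 1.0 / sigma ** 2                               # weights from measurement std devs

    x, y = rho * np.cos(theta), rho * np.sin(theta)    # convert to Cartesian
    xm, ym = np.average(x, weights=w), np.average(y, weights=w)
    Sxx = np.sum(w * (x - xm) ** 2)
    Syy = np.sum(w * (y - ym) ** 2)
    Sxy = np.sum(w * (x - xm) * (y - ym))

    alpha = 0.5 * np.arctan2(-2.0 * Sxy, Syy - Sxx)    # minimizes weighted squared distances
    r = xm * np.cos(alpha) + ym * np.sin(alpha)
    if r < 0:                                          # keep r non-negative by flipping the normal
        r, alpha = -r, alpha + np.pi
    return r, alpha

# Example: 17 noisy points roughly on the line x = 1 (expect r ≈ 1, alpha ≈ 0).
theta = np.linspace(-0.6, 0.6, 17)
rho = 1.0 / np.cos(theta) + 0.01 * np.random.randn(17)
print(fit_line_weighted(rho, theta))
```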

Segmentation for Line Extraction 4.3.1