POME: A mobile camera system for accurate indoor pose


1 POME: A mobile camera system for accurate indoor pose
Paul Montgomery & Andreas Winter
November 2016

2 ICT - Intelligent Construction Tools
A joint venture between Trimble and Hilti
Vision: revolutionize the way construction is done
Many inefficiencies persist on the construction site
The central technical problem: accurate and robust indoor positioning

3 Overview
1. ICT and the construction market
2. A brief history of indoor positioning
3. POME concept, theory of operation, system tradeoffs
4. POME error budget
5. System components
6. The usual problems
7. POME accuracy
8. The future: static and kinematic

4 Construction Site

5 Construction tools today
A precision instrument (not a tool):
5 arc sec sensor, ~2.5 mm at 100 m (horizontal, vertical)
$30-60K (expensive)
Requires experience to set up and use: careful installation on a stable tripod, correct referencing
Single user
Subject to line-of-sight occlusion
Issues in tracking at close range
Most sites still use traditional tools for most tasks

6 Requirements
Accuracy: < 6 mm
Robustness: to occlusion, to drops, to dust/dirt
Cost: BOM <= $400
Ease of installation: no cables, fast and reliable installation
Room size: > 30 meters
Challenges: rapidly changing environment, variable lighting conditions, many reflective surfaces

7 Existing Solutions
Hawk-Eye, which costs $100,000 and can pinpoint a ball to within 5 millimeters.

8 Why POME?
POME = Position & Orientation Measurement Engine
Multi-user
Indoor (and possibly outdoor)
Low cost & fully solid state
POME inside == GPS outside: similar weight & volume, similar cost, similar update rate, similar accuracy
Like GPS, enables many applications, e.g.: staked layout, projection systems, robotic systems, augmented reality

9 POME Applications

10 Principle of Operation
Measure angles between known points
Use redundant measurements
Use least squares to solve a set of non-linear equations
Q: How accurately can you measure angles with a (wide angle) camera?
(Figure: simplified 2D example with angles Θ_A and Θ_B to two known points and an angle uncertainty δθ)
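To make the least-squares step above concrete, here is a minimal 2D sketch, assuming bearings to surveyed targets and solving for position and heading with SciPy's nonlinear least squares. It is not the POME solver; the target coordinates, noise level, and initial guess are invented example values.

```python
# Illustrative sketch only: estimate a 2D pose (x, y, heading) from bearing
# angles to known targets via nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

targets = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])  # surveyed points (m), example values

def predicted_bearings(pose, targets):
    """Bearing of each target as seen from pose = (x, y, heading)."""
    x, y, heading = pose
    d = targets - np.array([x, y])
    return np.arctan2(d[:, 1], d[:, 0]) - heading

def residuals(pose, targets, measured):
    r = predicted_bearings(pose, targets) - measured
    return np.arctan2(np.sin(r), np.cos(r))  # wrap angle differences into [-pi, pi]

true_pose = np.array([4.0, 3.0, 0.2])
rng = np.random.default_rng(0)
measured = predicted_bearings(true_pose, targets) + rng.normal(0.0, 2.4e-4, len(targets))  # ~50 arcsec noise

sol = least_squares(residuals, x0=[5.0, 5.0, 0.0], args=(targets, measured))
print("estimated pose:", sol.x)
```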

11 2D example: intersection of 2 circles
A nonlinear problem
Each measured angle (Θ_AB subtended by targets A and B, Θ_CD subtended by targets C and D) constrains the camera to a circle through that target pair
The intersection of the 2 circles gives candidate solutions X & Y
Positions of A, B, C, D must be known
Uncertainty in the angle measurements results in a covariance ellipse
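A small geometric sketch of the construction on this slide, assuming each subtended angle places the camera on a circle through the corresponding target pair (inscribed-angle theorem). The target positions and angles used at the end are arbitrary examples.

```python
# Sketch of the 2D circle-intersection idea; values are illustrative only.
import numpy as np

def circle_from_chord_and_angle(p1, p2, theta):
    """One of the two circles on which the chord p1-p2 subtends the angle theta."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    chord = np.linalg.norm(p2 - p1)
    r = chord / (2.0 * np.sin(theta))                       # inscribed-angle theorem
    mid = 0.5 * (p1 + p2)
    n = np.array([-(p2 - p1)[1], (p2 - p1)[0]]) / chord     # unit normal to the chord
    center = mid + n * np.sqrt(max(r**2 - (chord / 2.0)**2, 0.0))
    return center, r

def intersect_circles(c0, r0, c1, r1):
    """The (up to two) intersection points of two circles -- the candidate solutions X and Y."""
    d = np.linalg.norm(c1 - c0)
    a = (r0**2 - r1**2 + d**2) / (2.0 * d)
    h = np.sqrt(max(r0**2 - a**2, 0.0))
    base = c0 + a * (c1 - c0) / d
    perp = np.array([-(c1 - c0)[1], (c1 - c0)[0]]) / d
    return base + h * perp, base - h * perp

cA, rA = circle_from_chord_and_angle([0, 0], [10, 0], np.radians(60))    # targets A, B and angle Θ_AB
cB, rB = circle_from_chord_and_angle([10, 10], [0, 10], np.radians(50))  # targets C, D and angle Θ_CD
X, Y = intersect_circles(cA, rA, cB, rB)
print("candidate solutions:", X, Y)
```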

12 Error Budget
Require 0.2 pixel 1 sigma with multiple cameras (1 pixel = 2.2 um)
Error contributors:
Achievable calibration accuracy
Lenses and mechanics
Mechanical stability
Centroid determination with saturated signals and with weak signals
Number and geometry of targets
Accuracy of target survey

13 System Design Considerations
Number of cameras
Arrangement of cameras for best F.O.V.
Type of image sensors: number/size of pixels, rolling/global shutter, color/monochrome, dynamic range of pixels
Type of lens projection function
Image sensor matching
Type of target: LEDs, visible/IR, power, pattern
Image processing considerations: image processing bandwidth, power
Cost!!
(Figure: early concept for 3-camera overlapping F.O.V.)

14 Active Targets
Transmit at 850 nm (near IR)
Approximately 350 mW
Modulated intensity

15 Projection from object space to image space
A camera is a projection from object space to image space: (x, y, z) -> (u, v)
1-to-1 mapping of rays (unit vectors) to image-space points (u, v)
A point of light becomes a blob
For pose calculation we need to convert from image-space points back to rays (angles) - a non-trivial mapping
The mapping function is different for every camera => need to calibrate
(Figure: object-space point (x, y, z), its ray (x/z, y/z, 1), the optical axis, angle θ, azimuth φ, the lens at O, and the image-sensor point (u, v) in image space)
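A minimal sketch of the back-projection described above, assuming a calibrated f-theta model: an image-space point (u, v) is converted into a unit ray in camera coordinates. The focal length and principal point are assumed example values (the focal length is chosen to roughly match the ~14 pixels/degree quoted on a later slide).

```python
# Hedged sketch: pixel (u, v) -> unit ray, assuming an ideal f-theta projection.
import numpy as np

F_PIX = 802.0           # assumed focal length in pixels per radian (~14 pixels/degree)
CX, CY = 960.0, 600.0   # assumed principal point (pixels)

def pixel_to_ray(u, v):
    """Unit direction of the ray that images to (u, v) under an f-theta lens."""
    du, dv = u - CX, v - CY
    r = np.hypot(du, dv)       # radial distance from the principal point (pixels)
    theta = r / F_PIX          # f-theta: radius is proportional to the ray angle
    phi = np.arctan2(dv, du)   # azimuth around the optical axis
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
```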

16 Fisheye (f-theta) lens projection
For a large F.O.V., a pinhole camera needs a very large image sensor: R = f tan θ
F-theta projection => equal angle increments map to equal numbers of pixels: R = f θ
The camera is an angle-measuring sensor
(Figure: pinhole projection R = f tan θ versus f-theta projection R = f θ, each shown with the optical axis, optical center, focal length f, and image sensor)
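A quick numeric illustration of the sensor-size point, assuming an example focal length roughly consistent with 2.2 um pixels at ~14 pixels/degree; it is not the actual POME lens specification.

```python
# Sensor radius needed for an 80-degree half-angle (160-degree F.O.V.), pinhole vs f-theta.
import numpy as np

f_mm = 1.8                  # assumed focal length (mm)
theta = np.radians(80.0)    # half of a 160-degree field of view

print("pinhole  R = f*tan(theta):", f_mm * np.tan(theta), "mm")  # ~10 mm radius
print("f-theta  R = f*theta:     ", f_mm * theta, "mm")          # ~2.5 mm radius
```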

17 F-theta lens
Equal angle is mapped to equal distance on the image sensor - the camera measures angles
Our cameras have ~14 pixels/degree => 1 pix = 250 arcsec => 0.2 pix = 50 arcsec
(Figure: 160-degree image circle on the image sensor; working range ~5-20 meters)
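A worked check of the numbers on this slide, plus the transverse position error they imply at the far end of the ~5-20 m working range; the range figure is a back-of-envelope inference, not a result quoted in the presentation.

```python
pixels_per_degree = 14.0
arcsec_per_pixel = 3600.0 / pixels_per_degree      # ~257 arcsec, i.e. the ~250 quoted
arcsec_02pix = 0.2 * arcsec_per_pixel              # ~51 arcsec, i.e. the ~50 quoted

range_m = 20.0                                     # far end of the ~5-20 m range
angle_rad = arcsec_02pix / 206265.0                # arcseconds -> radians
print("transverse error at 20 m:", angle_rad * range_m * 1000.0, "mm")  # ~5 mm
```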

18 Example blobs using off-the-shelf lenses
Non-symmetry of the impulse response across the F.O.V. strongly affects centroid determination accuracy
Non-uniformity of the energy distribution across the F.O.V. compounds the near/far problem

19 Custom Optics
DSL627 optimized lens

20 (Figure: lens field of view showing the +30 deg., 0 deg., and -30 deg. elevation circles and the F.O.V. boundary)

21 Some not very interesting images

22 Example blob

23 Saturated blob

24 Calibration residual
After removing an f-theta model, we are left with a residual
Calibrate by fitting a function to the residual (the inverse residual function)
Blob centroid (u, v) -> inverse residual function -> inverse f-theta -> angles -> pose solution
Lens and camera mechanics must be stable over time and temperature
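A hedged sketch of the calibration step on this slide: fit a smooth correction to the residual left after removing the ideal f-theta model, then apply it before back-projecting. A radial polynomial is only one plausible residual model, and the calibration data here are synthetic placeholders.

```python
import numpy as np

F_PIX = 802.0                                          # assumed f-theta focal length (pixels/radian)
theta_true = np.linspace(0.05, 1.3, 40)                # would come from surveyed calibration targets
r_measured = F_PIX * theta_true + 2.0 * theta_true**3  # synthetic stand-in for real lens data

residual = r_measured - F_PIX * theta_true             # what remains after the f-theta model
coeffs = np.polyfit(r_measured, residual, deg=3)       # residual as a function of measured radius

def corrected_theta(r_pix):
    """Apply the fitted residual correction, then the inverse f-theta model."""
    return (r_pix - np.polyval(coeffs, r_pix)) / F_PIX
```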

25 Shock and Vibration
Significant testing has been done on the POME head to verify stability
Below are the test setup and results from shock and vibration testing, with positive results

26 Lens stability testing results

27 The Usual Problems
Calibration + mechanical/thermal stability
Range ratio (near/far problem)
Registration (target determination)
Interference rejection (strong signals)
Multipath rejection (with and without direct ray)
Initialization
Data rate (~3000 Mb/s) / image processing / power
Solution: target modulation, synchronization
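For scale, a back-of-envelope check of where a raw data rate of roughly 3000 Mb/s could come from; every parameter below is an assumption for illustration, not the actual POME sensor configuration.

```python
cameras = 3                  # assumed number of cameras
pixels_per_frame = 5.0e6     # assumed 5 MP sensors
bits_per_pixel = 10          # assumed raw bit depth
frames_per_second = 20       # assumed frame rate

rate_mbps = cameras * pixels_per_frame * bits_per_pixel * frames_per_second / 1e6
print(rate_mbps, "Mb/s")     # ~3000 Mb/s with these assumptions
```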

28 Accuracy Testing - Warehouse
Indoor warehouse location with industrial and natural lighting, ~15 m x 10 m x 8 m
PLT used as the truth system for validation

29 Accuracy Testing

30 Test results for unit 0x
Position stations, with 16 azimuth stations at each position station
In the table on the following page:
each row is one position station
each column is one azimuth station
numbers show the position error relative to the truth system (PLT) in units of mm
each number represents a static-mode result
Target locations are shown with black dots and numeric IDs
Robot trajectory is shown with blue x marks
PLT location is shown with a red dot
Position station TP07 location is also shown
Warehouse dimensions are in units of meters

31 (Results table for the position stations, referenced on the previous slide)

32 Why is it difficult to state performance?
Performance is characterized by error statistics
To have significance, statistics need many measurements to validate
We have 6 errors to characterize at each point in space: 3 components of position error and 3 components of orientation error
Errors are worse in some directions than in other directions
There are a variable number of targets and variable working-volume geometry
There are different modes of operation (here, we document static and survey modes)
We plot the worst-direction 1 sigma errors in a square working volume

33 Example of scatter plot ellipsoids
A scatter plot of results creates a clump of data points distributed around the truth value
The statistics of the clump can be characterized by an error ellipsoid
Shown below are example 1 sigma ellipsoids for position and orientation
(Figures: position error ellipsoid and orientation error ellipsoid)
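A minimal sketch of how a 1 sigma error ellipsoid, and the worst-direction error used on the previous slide, can be computed from such a clump of results; the sample array is a stand-in for real test data.

```python
import numpy as np

samples = np.random.default_rng(1).normal(size=(200, 3)) * [2.0, 1.0, 0.5]  # fake position errors (mm)

cov = np.cov(samples, rowvar=False)        # 3x3 covariance of the clump
eigvals, eigvecs = np.linalg.eigh(cov)     # principal axes of the error ellipsoid

sigma_axes = np.sqrt(eigvals)              # 1 sigma semi-axis lengths (mm)
print("1 sigma semi-axes:", sigma_axes)
print("worst-direction 1 sigma:", sigma_axes[-1])  # largest axis
```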

34 Simulation Target Configurations
Nominal square room of size 10x10 meters
Consider 6 different target arrangements in the room
Targets installed at a uniform height and positioned on the walls around the circumference of the room
Calculate position and orientation accuracy at grid points in the room
Calculate solutions with 1 azimuth station (static mode) and with 4 azimuth stations (survey mode)
All simulations use 0.5 pixel 1 sigma, but we expect/hope to achieve 0.2 pixel 1 sigma in practice!!
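A hedged, simplified 2D sketch of this kind of accuracy simulation: per-measurement bearing noise equivalent to 0.5 pixel at ~14 pixels/degree is propagated through the linearized least-squares solution to give a position covariance at sample points, with the observer orientation assumed known. The target layout and evaluation points are illustrative, not the six studied arrangements.

```python
import numpy as np

targets = np.array([[0, 0], [10, 0], [10, 10], [0, 10],
                    [5, 0], [5, 10], [0, 5], [10, 5]], float)  # example 8-target layout (m)
sigma_rad = 0.5 / 14.0 * np.pi / 180.0                         # 0.5 pixel at ~14 pixels/degree

def worst_direction_sigma(p):
    """Worst-direction 1 sigma position error at point p (2D, bearings only, heading known)."""
    d = targets - p
    rng2 = np.sum(d**2, axis=1)
    # Jacobian of bearing = atan2(dy, dx) with respect to the observer position (x, y)
    J = np.column_stack([d[:, 1] / rng2, -d[:, 0] / rng2])
    cov = np.linalg.inv(J.T @ J / sigma_rad**2)                # least-squares position covariance
    return np.sqrt(np.linalg.eigvalsh(cov)[-1])

for p in [np.array([5.0, 5.0]), np.array([2.0, 2.0])]:
    print(p, "->", 1000.0 * worst_direction_sigma(p), "mm")
```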

35 10x10 m room, 8 targets, 1 azimuth station
0.5 pix 1 sigma, 1 azimuth station, 10x10 meter room
(Figures: worst-direction position and worst-direction orientation 1 sigma results)

36 10x10 m room, 8 targets, 4 azimuth stations
0.5 pix 1 sigma, 4 azimuth stations, 10x10 meter room
(Figures: worst-direction position and worst-direction orientation 1 sigma results)

37 The future
In the words of Yogi Berra: "I never make predictions, especially about the future."
Image sensors and image processing continue to develop quickly: reduced cost, sophisticated image processing of real images
Lenses and stability will remain challenges to accuracy
Mobile cameras and lightweight infrastructure (scalability, infrastructure):
Step 1: reduce the number of required active targets, use natural features
Step 2: sensor fusion with inertial, ranging camera, stereo camera, ...
