Structured Light

Structured Light. The main problem of photogrammetry: to recover shape from multiple views of a scene, we need to find correspondences between the images. The matching/correspondence problem is hard, and the 3D object geometry cannot be reconstructed in image regions without well-defined image points. The structured light idea: find ways to simplify matching and to guarantee dense coverage with homologous points. The general strategy is to use illumination to create your own correspondences. This is probably the most robust method in practice and is widely used (industrial metrology, entertainment, ...). It is also known as active stereo or white-light scanning.

Basic Principle. Light projection: use a projector to create unambiguous correspondences; with these correspondences, apply conventional stereo. If we project a single point, matching is unique, but many images are needed to cover the object. Note: less sophisticated variants include dusting surfaces with chalk, spraying them with water, etc.
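To make the "apply conventional stereo" step concrete, here is a minimal sketch (not from the slides) of triangulating one projected dot seen by a rectified, calibrated stereo pair, using the standard relation Z = f·B/d; all numeric values are made up for illustration.

```python
import numpy as np

def depth_from_disparity(x_left, x_right, y, f_px, baseline_m, cx, cy):
    """Triangulate one correspondence in a rectified stereo pair.

    Assumes both cameras share focal length f_px (pixels) and principal
    point (cx, cy), and are separated by baseline_m along the x-axis.
    """
    d = x_left - x_right                # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    Z = f_px * baseline_m / d           # depth along the optical axis
    X = (x_left - cx) * Z / f_px        # back-project into the left camera frame
    Y = (y - cy) * Z / f_px
    return np.array([X, Y, Z])

# Example with made-up values: a projected dot seen at x = 412 px in the left
# image and x = 388 px in the right image of a rectified rig.
print(depth_from_disparity(412.0, 388.0, 300.0, f_px=800.0,
                           baseline_m=0.12, cx=320.0, cy=240.0))
```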

Variant: Pattern projection. Project a pattern instead of a single point. This needs only a single image (one-shot recording), but matching is no longer unique, although it is still easier. More on this later.

Line projection. Extension: for a calibrated camera rig, the epipolar geometry is known, so we can project a line instead of a single point. Far fewer images are needed (one for all points on the line); the line intersects each epipolar line in one point, so matching is still unique.
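The intersection with the epipolar line is a plain 2D line intersection, which is compact in homogeneous coordinates: two lines stored as coefficient vectors (a, b, c) of ax + by + c = 0 meet at their cross product. A small sketch with assumed example lines:

```python
import numpy as np

def intersect_lines(l1, l2):
    """Intersection of two 2D lines given in homogeneous form (a, b, c)."""
    p = np.cross(l1, l2)          # homogeneous intersection point
    if abs(p[2]) < 1e-12:
        raise ValueError("lines are (nearly) parallel")
    return p[:2] / p[2]           # back to inhomogeneous (x, y)

# Example: image of the projected stripe x = 250 (1*x + 0*y - 250 = 0)
# intersected with an epipolar line y = 0.1*x + 40 (-0.1*x + 1*y - 40 = 0).
stripe_line   = np.array([1.0, 0.0, -250.0])
epipolar_line = np.array([-0.1, 1.0, -40.0])
print(intersect_lines(stripe_line, epipolar_line))   # -> [250.  65.]
```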

But we can do better. Observation: a projector is just an inverse camera; the ray direction is reversed, and the projector is described by the same geometric model. The projected pattern and the image define two rays in space, so a projector and one camera are sufficient for triangulation.
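A minimal sketch of that triangulation (not from the slides): given the camera ray through a pixel and the projector ray through the corresponding pattern point, take the midpoint of the closest points of the two rays. The example geometry is made up.

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Midpoint of the closest points between two rays o + t*d.

    o1, d1: camera centre and viewing ray direction for the camera pixel.
    o2, d2: projector centre and ray direction for the projected pattern point.
    Directions need not have unit length.
    """
    o1, d1 = np.asarray(o1, float), np.asarray(d1, float)
    o2, d2 = np.asarray(o2, float), np.asarray(d2, float)
    # Least-squares ray parameters t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return 0.5 * (p1 + p2)        # midpoint of the two closest points

# Made-up example: camera at the origin, projector 0.2 m to its right,
# both rays pointing at a surface point roughly 1 m away.
print(triangulate_rays([0, 0, 0], [0.05, 0.0, 1.0],
                       [0.2, 0, 0], [-0.15, 0.0, 1.0]))
```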

Calibrated projection. The projector-camera system must be calibrated (otherwise we would not be able to triangulate), and the projected pattern is known. The depth therefore depends only on the image point location, and depth computation reduces to a table lookup.

Laser stripe. Line projection: shine a laser through a cylindrical lens; the result is a planar sheet of light. Example: the Digital Michelangelo Project, where a laser line is rigidly mounted to the camera and the complete system is scanned across the object.
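For a laser stripe, triangulation reduces to intersecting each camera ray with the known light plane. A minimal sketch (an illustration, not the Michelangelo pipeline), assuming a pinhole camera with intrinsics K and the light plane given as n·X = c in camera coordinates; the numbers are made up.

```python
import numpy as np

def stripe_point(u, v, K, plane_n, plane_c):
    """Intersect the camera ray through pixel (u, v) with the laser plane n.X = c.

    K is the 3x3 camera intrinsic matrix; the camera centre is the origin.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing ray direction
    denom = plane_n @ ray
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the light plane")
    t = plane_c / denom                               # ray parameter at the plane
    return t * ray                                    # 3D point on the object

# Made-up numbers: 800 px focal length, laser plane tilted about the y-axis.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
plane_n = np.array([0.5, 0.0, 0.5])   # plane normal (need not be unit length)
plane_c = 0.4                         # plane offset in n . X = c
print(stripe_point(400.0, 260.0, K, plane_n, plane_c))
```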

Digital Michelangelo Project http://graphics.stanford.edu/projects/mich

Pattern projectors. Project multiple stripes or a grid. This avoids scanning or taking multiple images: only a single image is required to cover the scene. But the stripes/grid points are no longer unique. Which stripe matches which? This is the correspondence problem again.

Pattern projectors. Answer 1: assume surface continuity, i.e. the ordering constraint, as in conventional stereo.

Pattern projectors. Answer 2: coloured stripes or dots. These can be designed such that local patterns are unambiguous (cf. bar codes), but they are difficult to use on coloured surfaces.

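One way to see how local colour patterns can be made unambiguous (an illustrative sketch of the idea, not a specific published code) is to build a stripe sequence in which every window of three consecutive colours occurs only once, so any three neighbouring stripes identify their own position:

```python
def unique_window_code(colors=("R", "G", "B"), window=3, length=12):
    """Greedily build a colour-stripe sequence whose length-`window`
    sub-sequences are all distinct (and adjacent stripes differ)."""
    seq, seen = [], set()

    def extend():
        if len(seq) == length:
            return True
        for c in colors:
            if seq and c == seq[-1]:
                continue                      # avoid identical neighbouring stripes
            seq.append(c)
            key = tuple(seq[-window:]) if len(seq) >= window else None
            if key is None or key not in seen:
                if key is not None:
                    seen.add(key)             # reserve this local pattern
                if extend():
                    return True
                if key is not None:
                    seen.discard(key)         # backtrack
            seq.pop()
        return False

    if not extend():
        raise ValueError("no code of that length exists for these parameters")
    return seq

print("".join(unique_window_code()))   # e.g. a 12-stripe code over R, G, B
```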

Pattern projectors. Answer 3: time-coded light patterns, the best compromise between single stripes and a stripe pattern. Use a sequence of binary patterns, requiring only log N images to distinguish N stripes; each stripe has a unique binary illumination code over time.
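As an illustration of the time-coded idea (a sketch, assuming plain binary-reflected Gray codes and a projector with only 8 stripe columns; real systems also project inverted patterns and threshold per pixel), the following generates the log2(N) stripe patterns and decodes the stripe index observed at a camera pixel:

```python
import numpy as np

def gray_code_patterns(n_columns):
    """Binary stripe patterns: patterns[b, c] is the b-th projected bit of column c."""
    n_bits = int(np.ceil(np.log2(n_columns)))
    cols = np.arange(n_columns)
    gray = cols ^ (cols >> 1)                       # binary-reflected Gray code
    bits = (gray[None, :] >> np.arange(n_bits - 1, -1, -1)[:, None]) & 1
    return bits                                     # shape (n_bits, n_columns), MSB first

def decode(bits_seen):
    """Recover the projector column from the per-pixel bit sequence (MSB first)."""
    gray = 0
    for b in bits_seen:
        gray = (gray << 1) | int(b)
    # Convert the Gray code back to a binary column index (prefix XOR of the bits)
    col, shift = gray, 1
    while (gray >> shift) > 0:
        col ^= gray >> shift
        shift += 1
    return col

patterns = gray_code_patterns(8)        # 3 binary patterns suffice for 8 stripes
print(patterns)
# A camera pixel that observed the bit sequence of column 5 decodes back to 5:
print(decode(patterns[:, 5]))           # -> 5
```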

Phase-shift projection. Increasing the resolution: project three phase-shifted sinusoidal patterns; they can be projected sequentially, or simultaneously in different colours. The recorded intensities allow the phase angle of a pixel within one wavelength to be computed.

Phase-shift projection. Phase angle from brightness values: the phase angle is computed from the three images, and although the method relies on brightness, the ambient light and the power of the projector need not be known. The observed intensities are

I_- = I_base + I_var cos(φ - θ)
I_0 = I_base + I_var cos(φ)
I_+ = I_base + I_var cos(φ + θ)

Forming the ratio (I_- - I_+) / (2 I_0 - I_- - I_+) removes the dependence on I_base (it cancels in numerator and denominator) and on I_var (it appears only as a common factor):

(I_- - I_+) / (2 I_0 - I_- - I_+)
= [cos(φ - θ) - cos(φ + θ)] / [2 cos(φ) - cos(φ - θ) - cos(φ + θ)]
= [2 sin(φ) sin(θ)] / [2 cos(φ) (1 - cos(θ))]
= tan(φ) sin(θ) / (1 - cos(θ))
= tan(φ) / tan(θ/2),

using the identities cos(φ ∓ θ) = cos(φ) cos(θ) ± sin(φ) sin(θ) and tan(θ/2) = (1 - cos(θ)) / sin(θ). The phase angle φ ∈ (0, 2π) therefore follows from

tan(φ) = tan(θ/2) (I_- - I_+) / (2 I_0 - I_- - I_+),   i.e.   φ = arctan( tan(θ/2) (I_- - I_+) / (2 I_0 - I_- - I_+) ).
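The final formula drops straight into code. The following is a minimal per-pixel sketch (not code from the slides), assuming a shift of θ = 2π/3 and using atan2 so that φ covers the full 2π range; I_minus, I_0 and I_plus stand for the three recorded images as floating-point arrays or scalars:

```python
import numpy as np

def wrapped_phase(I_minus, I_0, I_plus, theta=2 * np.pi / 3):
    """Per-pixel phase from three images taken under patterns shifted by -theta, 0, +theta.

    Implements phi = atan2(tan(theta/2) * (I- - I+), 2*I0 - I- - I+),
    which cancels both the ambient term I_base and the modulation I_var.
    Returns values wrapped into [0, 2*pi).
    """
    num = np.tan(theta / 2.0) * (I_minus - I_plus)
    den = 2.0 * I_0 - I_minus - I_plus
    return np.mod(np.arctan2(num, den), 2.0 * np.pi)

# Synthetic check: simulate one pixel with unknown ambient light and contrast.
phi_true, I_base, I_var, theta = 1.234, 0.37, 0.55, 2 * np.pi / 3
I_m = I_base + I_var * np.cos(phi_true - theta)
I_0 = I_base + I_var * np.cos(phi_true)
I_p = I_base + I_var * np.cos(phi_true + theta)
print(wrapped_phase(I_m, I_0, I_p))   # ~1.234, independent of I_base and I_var
```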

Phase-shift projection. Total phase: the phase angle only determines the relative position within one cycle of the periodic sine wave; we also need to know which stripe we are in (cf. the GPS phase ambiguity). This is achieved by an ordering assumption, or by combination with stereo.

Phase-shift projection. Total phase: the phase angle within a period comes from the intensities, the period number from stereo triangulation. Stereo matching is easy, since there are only N possibilities. The absolute phase is φ(x, y) = 2π k(x, y) + φ'(x, y), where k ∈ {0, ..., N - 1} is the period (stripe) number from stereo, φ' is the phase from the phase shift, and N is the number of periods (stripes).
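Combining the two measurements is then a single line per pixel. A sketch under the slide's assumptions, where k is the stripe index from coarse stereo (or ordering) and phi_wrapped is the phase from the previous step; the period count N and projector width W in the example are made-up illustration values:

```python
import numpy as np

def absolute_phase(k, phi_wrapped):
    """phi(x, y) = 2*pi*k(x, y) + phi'(x, y), with k in {0, ..., N-1}."""
    return 2.0 * np.pi * k + phi_wrapped

# Example: stripe index 3 (from stereo), wrapped phase 1.234 rad (from intensity).
phi = absolute_phase(3, 1.234)
# Assuming the projector spans N = 16 sine periods over a width of W pixels,
# the projector column is proportional to the absolute phase:
N, W = 16, 1024
print(phi, phi / (2 * np.pi * N) * W)   # absolute phase and projector x-coordinate
```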

Phase-shift projection. Real-time in-hand scanning.

Simple stripe projector. Quite accurate systems are surprisingly cheap: see http://www.david-laserscanner.com. The hardware costs less than 100 CHF (webcam, handheld line laser, calibration board), and the software is freely available from TU Braunschweig.

Industrial systems. Many vendors; big business, and maybe the success story of image-based metrology.

Consumer application. People now have it in their living room: the Xbox Kinect, with its periodic infrared dot pattern.

Structured Light. Advantages: robust (solves the correspondence problem) and fast (instantaneous recording, real-time processing). Limitations: less flexible than passive sensing; it needs specialised equipment and a suitable environment. Applications: industrial inspection, entertainment, healthcare, heritage documentation, ...