Depth Camera for Mobile Devices

Instructor: Simon Lucey. 16-423 - Designing Computer Vision Apps.

Today: Stereo Cameras, Structured Light Cameras, Time of Flight (ToF) Cameras.

Inferring 3D Points. Given we have prior knowledge of the intrinsic parameters $\{\Lambda_j\}_{j=1}^{J}$, the extrinsic parameters $\{\Omega_j, \tau_j\}_{j=1}^{J}$, and the corresponding points $\{x_j\}_{j=1}^{J}$, the question is how to estimate the 3D point $w$.

Inferring 3D Points. $\hat{w} = \operatorname*{argmin}_{w} \sum_{j=1}^{J} \{\, x_j - \text{pinhole}[w, \Lambda_j, \Omega_j, \tau_j] \,\}$, where e.g. $\{x\} = \|x\|_2^2$.
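
The slide gives only the objective; below is a minimal sketch (not from the lecture) of minimizing this reprojection error directly, assuming calibrated views with intrinsics K and pose (R, t), and using SciPy's generic least-squares solver. All function names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def pinhole(w, K, R, t):
    """Project the 3D point w through a camera with intrinsics K and pose (R, t)."""
    p = K @ (R @ w + t)   # 3-vector in homogeneous image coordinates
    return p[:2] / p[2]   # perspective division -> 2D pixel position

def triangulate(xs, Ks, Rs, ts, w0):
    """Estimate w by minimizing the summed squared reprojection error over J views."""
    def residuals(w):
        return np.concatenate(
            [x - pinhole(w, K, R, t) for x, K, R, t in zip(xs, Ks, Rs, ts)])
    return least_squares(residuals, w0).x
```

Here xs is the list of observed 2D points and w0 an initial guess, e.g. from the linear method derived next.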

Inferring 3D Points. The optimization problem is inherently non-linear due to the pinhole camera function, but it can be made linear using homogeneous coordinates.

Inferring 3D Points. Write out the j-th pinhole camera in homogeneous coordinates, then pre-multiply with the inverse of the intrinsics matrix.

Inferring 3D Points. The last equation gives the unknown scale; substituting back into the other two equations and re-arranging gives a system of linear equations. What is the minimum number of cameras (J)?
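
The equations themselves were lost in transcription; the following is a reconstruction of the standard linear triangulation derivation under the notation above, where $\omega_i^\top$ denotes the i-th row of $\Omega_j$ and $\tau = (\tau_x, \tau_y, \tau_z)^\top$:

```latex
% Camera j in homogeneous coordinates (\lambda is an unknown scale):
\lambda \begin{bmatrix} x_j \\ y_j \\ 1 \end{bmatrix}
  = \Lambda_j \left[ \Omega_j \;\; \tau_j \right]
    \begin{bmatrix} u \\ v \\ w \\ 1 \end{bmatrix}

% Pre-multiplying by \Lambda_j^{-1} gives normalized coordinates (x'_j, y'_j):
\lambda \begin{bmatrix} x'_j \\ y'_j \\ 1 \end{bmatrix}
  = \left[ \Omega_j \;\; \tau_j \right]
    \begin{bmatrix} u \\ v \\ w \\ 1 \end{bmatrix}

% The third row gives \lambda = \omega_3^\top [u\; v\; w]^\top + \tau_z.
% Substituting into the first two rows and re-arranging:
\left( x'_j\, \omega_3^\top - \omega_1^\top \right) [u\; v\; w]^\top = \tau_x - x'_j\, \tau_z
\left( y'_j\, \omega_3^\top - \omega_2^\top \right) [u\; v\; w]^\top = \tau_y - y'_j\, \tau_z
```

Each camera contributes two linear equations in the three unknowns (u, v, w), so the minimum is J = 2 cameras.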

Stereo Camera

Stereo Camera - 6.35 cm baseline. What is better: a wide or a narrow baseline?
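
The slide leaves the question open; the standard rectified-stereo relations below (a reconstruction, not from the slides) make the trade-off explicit, with focal length f, baseline b, and disparity d:

```latex
% Depth from disparity in a rectified stereo pair:
z = \frac{f\, b}{d}
% Sensitivity of depth to a disparity error:
\left| \frac{\partial z}{\partial d} \right| = \frac{f\, b}{d^{2}} = \frac{z^{2}}{f\, b}
```

A wide baseline gives finer depth resolution at range but causes more occlusion and harder matching; a narrow baseline (like the 6.35 cm here) matches easily but is noisier in depth.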

Stereo Camera

Examples in Mobile - Amazon Fire Phone. Why 4 cameras?

Limitations - Texture. The approach only works if an image patch has texture! $A(\Delta x) = \sum_{x_k \in \mathcal{N}(x)} \| I(x_k) - I(x_k + \Delta x) \|^2$ [Figure: example patches and their matching-cost surfaces $A(\Delta x)$]
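
A minimal sketch of this patch-matching cost, assuming a grayscale image as a NumPy array; it shows why textureless patches are ambiguous: the cost surface A(Δx) is nearly flat, so no displacement stands out. Names and defaults are illustrative.

```python
import numpy as np

def match_cost_surface(img, x, y, half=12, search=10):
    """SSD cost A(dx) of the patch at (x, y) against horizontally shifted copies."""
    patch = img[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    costs = np.empty(2 * search + 1)
    for i, dx in enumerate(range(-search, search + 1)):
        shifted = img[y - half:y + half + 1,
                      x + dx - half:x + dx + half + 1].astype(float)
        costs[i] = np.sum((patch - shifted) ** 2)   # sum of squared differences
    return costs  # nearly flat (no clear minimum) when the patch lacks texture
```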

Today: Stereo Cameras, Structured Light Cameras, Time of Flight (ToF) Cameras.

Projector vs. Camera - Camera

Projector vs. Camera - Projector

Depth from Structured Light. How can we get away with one camera? The projector acts as an inverse camera: because the projected pattern is known, each observed pattern feature yields a correspondence, and depth follows by triangulating between the camera and the projector (see the sketch below).
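
A minimal sketch of one common structured-light scheme (temporal Gray codes), assuming N binary stripe patterns were projected and captured as a NumPy stack; the decoded projector column plus camera-projector triangulation yields depth. All names here are illustrative, not from the lecture.

```python
import numpy as np

def decode_gray_code(captures, threshold=127):
    """captures: (N, H, W) images of N projected binary Gray-code stripe patterns.
    Returns the projector column index seen by each camera pixel."""
    bits = (captures > threshold).astype(np.uint8)       # per-pixel bit per pattern
    binary = np.empty_like(bits)                          # Gray -> plain binary:
    binary[0] = bits[0]                                   # b0 = g0
    for i in range(1, bits.shape[0]):
        binary[i] = binary[i - 1] ^ bits[i]               # bi = b(i-1) XOR gi
    weights = 1 << np.arange(bits.shape[0] - 1, -1, -1)   # MSB first
    return np.tensordot(weights, binary, axes=1)          # (H, W) column ids

def depth_from_columns(cam_cols, proj_cols, f, baseline):
    """Rectified camera-projector pair: depth z = f * b / disparity."""
    disparity = cam_cols - proj_cols
    return np.where(disparity != 0, f * baseline / disparity, np.inf)
```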

PrimeSense - Kinect 1.0 Camera. What does the pattern look like? First region: gives a highly accurate depth surface for near objects (approx. 0.8 - 1.2 m). Second region: gives a medium-accuracy depth surface (approx. 1.2 - 2.0 m). Third region: gives a low-accuracy depth surface for far objects (approx. 2.0 - 3.5 m).

Examples in Mobile

ItSeez - App

Limitations - Range

Limitations - Defocus. [Figure: (a) Scene, (b) Disparity Map. Illumination defocus: in (a) a checkerboard pattern is projected onto a scene using a DMD projector. The scene consists of three planar targets at distances of 60 cm, 80 cm and 120 cm from the projector-camera system. The pattern is focused on the nearest plane; due to the shallow depth of field, it is poorly focused at the other distances. Running structured light on this scene yields poor results (b) because of illumination defocus: the foreground is reconstructed accurately because it is well focused, but the rest of the scene is reconstructed poorly.]

Limitations - Ambient Light. A sunny day on Earth can reach up to 1120 W m⁻²; a tabletop projector emits on average 10 W of light. [Plot: spectral irradiance (W m⁻² nm⁻¹) vs. wavelength (nm), showing extraterrestrial radiation and direct + circumsolar irradiance over 0 - 4000 nm]

Today: Stereo Cameras, Structured Light Cameras, Time of Flight (ToF) Cameras.

Time of Flight Cameras. Light travels at an approximately constant speed, c = 3×10⁸ m s⁻¹. By measuring the time it takes light to travel over a distance, one can infer that distance. ToF cameras can be categorized into two types (see the worked equations below): 1. Direct ToF - switch a laser on and off rapidly and time the returning pulse. 2. Indirect ToF - send out modulated light, then measure the phase difference of the return to infer depth.
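
The governing relations are not in the transcript; these standard formulas (a reconstruction) make the two categories concrete, with Δt the pulse round-trip time, Δφ the measured phase shift, and f_mod the modulation frequency:

```latex
% Direct ToF: time the round trip of a pulse
d = \frac{c\, \Delta t}{2}
% Indirect ToF: distance from the phase shift of a continuous wave
d = \frac{c\, \Delta\varphi}{4\pi f_{\mathrm{mod}}},
\qquad
d_{\max} = \frac{c}{2 f_{\mathrm{mod}}} \quad \text{(unambiguous range)}
```

For example, at the 20 MHz modulation shown later, d_max = (3×10⁸)/(2 × 2×10⁷) = 7.5 m.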

Direct ToF. Light Detection And Ranging (LiDAR) is probably the best example in computer vision and robotics. High-energy light pulses limit the influence of background illumination. However: it is difficult to generate short light pulses with fast rise and fall times; high-accuracy time measurement is required; it is prone to motion blur; and it is expensive.

Direct ToF - Zebedee (CSIRO)

Indirect ToF. Continuous light waves instead of short light pulses, modulated in terms of the frequency of sinusoidal waves. The detected wave after reflection has a shifted phase, and the phase shift is proportional to the distance from the reflecting surface. [Diagram: an emitter sends a 20 MHz continuous wave at a 3D surface; a detector and phase meter measure the phase shift of the reflected wave]
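
A minimal sketch of how an indirect ToF pixel might turn raw samples into depth, assuming the common four-bucket scheme in which the correlation is sampled at 0°, 90°, 180° and 270° of the modulation period (a standard technique, not taken from the slides):

```python
import numpy as np

C = 3e8        # speed of light, m/s
F_MOD = 20e6   # modulation frequency from the slide, Hz

def depth_from_four_buckets(a0, a90, a180, a270):
    """Four-bucket phase demodulation for a continuous-wave ToF pixel.
    a0..a270 are correlation samples at 0/90/180/270 degrees of the modulation."""
    phase = np.arctan2(a90 - a270, a0 - a180)  # phase shift, in [-pi, pi]
    phase = np.mod(phase, 2 * np.pi)           # wrap into [0, 2*pi)
    return C * phase / (4 * np.pi * F_MOD)     # distance; ambiguous beyond 7.5 m
```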

Indirect ToF

Examples - Mobile

REAL3™ Image Sensor

The Future