Surround Structured Lighting for Full Object Scanning

Surround Structured Lighting for Full Object Scanning
Douglas Lanman, Daniel Crispell, and Gabriel Taubin
Brown University, Dept. of Engineering
August 21, 2007

Outline
- Introduction and Related Work
- System Design and Construction
- Calibration and Reconstruction
- Experimental Results
- Conclusions and Future Work

Review: Gray Code Structured Lighting
- Hardware: Point Grey Flea2 camera (15 Hz @ 1024x768), Mitsubishi XD300U projector (50-85 Hz @ 1024x768)
- 3D reconstruction using structured light [Inokuchi 1984]
- Recover 3D depth for each pixel using ray-plane intersection
- Determine correspondence between camera pixels and projector planes by projecting a temporally-multiplexed binary image sequence
- Each image is a bit-plane of the Gray code for each projector row/column
References: [8,9]

Review: Gray Code Structured Lighting (continued)
- Encoding algorithm: integer row/column index → binary code → Gray code
References: [8,9]
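The integer → binary → Gray code step is a single bit operation, and generating the stripe images from it is equally short. A minimal Python/NumPy sketch under assumed image dimensions; the function and variable names are illustrative, not from the authors' implementation:

```python
import numpy as np

def gray_encode(index):
    """Convert an integer row/column index to its reflected Gray code."""
    return index ^ (index >> 1)

def column_bitplanes(width=1024, height=768, n_bits=10):
    """Build the n_bits binary stripe images that encode projector columns.

    Each returned image is one bit-plane of the Gray code: pixel (r, c) is
    255 when the corresponding bit of gray_encode(c) is set, 0 otherwise.
    """
    gray = gray_encode(np.arange(width))
    planes = []
    for b in range(n_bits - 1, -1, -1):              # most-significant bit first
        stripe = ((gray >> b) & 1).astype(np.uint8) * 255
        planes.append(np.tile(stripe, (height, 1)))  # replicate down the rows
    return planes
```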

Recovery of Projector-Camera Correspondences
[Figure: recovered row and column correspondences]
- 3D reconstruction using structured light [Inokuchi 1984]
- Our implementation uses a total of 42 images (2 to measure dynamic range, 20 to encode rows, 20 to encode columns)
- Individual bits are assigned by testing whether the bit-plane or its inverse is brighter at each pixel
- Decoding algorithm: Gray code → binary code → integer row/column index
References: [8,9]
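Decoding inverts the mapping per pixel: compare each captured bit-plane against its inverse, reassemble the Gray code, and convert it back to an integer row/column index. A hedged Python/NumPy sketch, assuming the captured images are already available as grayscale arrays; the names and the contrast threshold are illustrative:

```python
import numpy as np

def gray_decode(gray):
    """Convert reflected Gray codes back to integer indices (vectorized)."""
    binary = gray.copy()
    shift = binary >> 1
    while np.any(shift):
        binary ^= shift
        shift >>= 1
    return binary

def decode_correspondences(bitplanes, inverses, min_contrast=10):
    """Recover the projector column (or row) index for every camera pixel.

    bitplanes[i] / inverses[i] are the captured photos of Gray code bit-plane
    i and its inverse, most-significant bit first. Pixels whose measured
    contrast never exceeds min_contrast are marked invalid (-1).
    """
    code = np.zeros(bitplanes[0].shape, dtype=np.int64)
    valid = np.zeros(bitplanes[0].shape, dtype=bool)
    for img, inv in zip(bitplanes, inverses):
        img, inv = img.astype(np.int32), inv.astype(np.int32)
        code = (code << 1) | (img > inv)   # bit is 1 where the plane is brighter
        valid |= np.abs(img - inv) > min_contrast
    index = gray_decode(code)
    index[~valid] = -1
    return index
```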

Overview of Projector-Camera Calibration
[Figure: estimated camera lens distortion]
- Camera calibration procedure uses the Camera Calibration Toolbox for Matlab by J.-Y. Bouguet
- Distortion model: normalized ray → distorted ray (4th-order radial + tangential) → predicted image-plane projection
References: [11,12,13]

Overview of Projector-Camera Calibration
[Figure: estimated projector lens distortion]
Projector calibration procedure:
- Consider the projector as an inverse camera (i.e., it maps intensities to 3D rays)
- Observe a calibration board with a set of fiducials in known locations
- Use the fiducials to recover the calibration plane in the camera coordinate system
- Project a checkerboard onto the calibration board and detect its corners
- Apply ray-plane intersection to recover the 3D position of each projected corner
- Use the Camera Calibration Toolbox to recover intrinsic/extrinsic projector calibration from the 2D-3D correspondences, with 4th-order radial distortion
References: [11,12,13]
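The geometric core of this procedure (and of the reconstruction later) is intersecting a camera ray with a known plane. A minimal Python sketch, assuming pixels have already been undistorted and normalized; the plane values in the example are invented for illustration:

```python
import numpy as np

def ray_plane_intersection(ray_dir, plane, ray_origin=np.zeros(3)):
    """Intersect a camera ray X = origin + t * dir with the plane n·X + d = 0.

    ray_dir    : 3-vector, e.g. [x_n, y_n, 1] for a normalized, undistorted pixel
    plane      : 4-vector (n_x, n_y, n_z, d) in the camera coordinate system
    ray_origin : camera center (the origin for the physical camera)
    """
    n, d = plane[:3], plane[3]
    t = -(n @ ray_origin + d) / (n @ ray_dir)
    return ray_origin + t * ray_dir

# Example: back-project one detected checkerboard corner, seen at normalized
# image coordinates (0.12, -0.05), onto a calibration plane recovered from the
# printed fiducials (the plane values below are made up).
plane = np.array([0.0, 0.1, -0.995, 600.0])    # n·X + d = 0, d in mm
corner_3d = ray_plane_intersection(np.array([0.12, -0.05, 1.0]), plane)
```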

Projector-Camera Calibration
[Figure: recovered 3D positions and orientations of the projector (O_p) and camera coordinate frames (O_c1, O_c2)]
- Projector calibration follows the procedure on the previous slide
References: [11,12,13]

Gray Code Structured Lighting Results

Proposed Improvement: Surround Lighting
Limitations of structured lighting:
- Only recovers the mutually visible surface (i.e., points must be both illuminated and imaged)
- A complete model requires multiple scans or additional projectors/cameras
- Often requires post-processing (e.g., ICP)
Proposed solution:
- Trade spatial for angular resolution: obtain multiple views by including planar mirrors
- What about illumination interference? Use orthographic illumination
System components:
- Multi-view: digital camera + planar mirrors
- Orthographic: DLP projector + Fresnel lens
References: [1]

Related Work
Structured light for 3D scanning:
- Over 20 years of research [Salvi '04]
- Gray code sequences [Inokuchi '84]
- Recent real-time methods [Zhang '06]
- Including planar mirrors [Epstein '04]
Multi-view reconstruction using planar mirrors:
- Visual hull using mirrors [Forbes '06]
- Catadioptric stereo [Gluckman '99]
- Mirror MoCap [Lin '02]
Orthographic projectors:
- Recent work by Nayar and Anand on volumetric displays using passive optical scatterers [SIGGRAPH '06] introduces orthographic projectors
References: [2,3,4,7]

Outline
- Introduction and Related Work
- System Design and Construction
- Calibration and Reconstruction
- Experimental Results
- Conclusions and Future Work

Surround Structured Lighting Components
- Mitsubishi XD300U projector (1024x768)
- Point Grey Flea2 digital camera (1024x768)
- Manfrotto 410 compact geared tripod head
- 11''x11'' Fresnel lens (Fresnel Technologies #54)
- 15''x15'' first-surface mirrors
- Newport Optics kinematic mirror mounts
References: [1]

Mechanical Alignment Procedure
Manual projector alignment:
- The center of projection must be at the focal point of the Fresnel lens for an orthographic configuration
- Given the intrinsic projector calibration, we predict the projection of a known pattern on the surface of the Fresnel lens
[Figure: projected calibration pattern, printed calibration pattern (affixed to the Fresnel lens surface), and result of mechanical alignment (coincident projected and printed patterns)]
References: [1]

Mechanical Alignment Procedure
Manual mirror alignment:
- The mirrors must be aligned so that the plane spanned by their surface normals is parallel to the orthographic illumination rays
- Projected Gray code stripe patterns assist in manually adjusting the mirror orientations
Step 1: Alignment using a flat surface
- Cover each mirror with a blank surface; adjust the uncovered mirror so that the reflected and projected stripes coincide
Step 2: Alignment using a cylinder
- Place a blank cylindrical object in the center of the scanning volume
- Adjust both mirrors until the reflected stripes coincide on the cylinder surface
References: [1]

Outline
- Introduction and Related Work
- System Design and Construction
- Calibration and Reconstruction
- Experimental Results
- Conclusions and Future Work

Orthographic Projector Calibration
[Figure: estimated plane coefficient d as a function of projector row]
Orthographic projector calibration using structured light:
- Observe a checkerboard calibration pattern at several positions/poses
- Recover the calibration planes in the camera coordinate system
- Find camera-pixel-to-projector-plane correspondences using Gray codes
- Apply ray-plane intersection to recover a labeled 3D point cloud
- Fit a plane to the set of all 3D points corresponding to each projector row
- Filter/extrapolate the plane coefficients using a best-fit quadratic polynomial
References: [12]
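The last two steps (fitting a plane to each row's labeled points, then smoothing the plane coefficients with a quadratic in the row index) might look roughly like the following Python/NumPy sketch. It assumes points_per_row maps each projector row index to an Nx3 array of reconstructed points; everything else is illustrative:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane n·X + d = 0 (with ||n|| = 1) through an Nx3 cloud."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                         # direction of least variance
    if n[2] > 0:                       # keep a consistent sign convention
        n = -n
    return n, -float(n @ centroid)

def calibrate_orthographic_rows(points_per_row, deg=2):
    """Fit one light plane per projector row, then smooth/extrapolate every
    plane coefficient with a best-fit polynomial of degree `deg` in the row index."""
    rows = np.array(sorted(points_per_row))
    planes = np.array([np.append(*fit_plane(points_per_row[r])) for r in rows])

    smoothed = np.empty_like(planes)
    for k in range(4):                 # n_x, n_y, n_z, d
        poly = np.polyfit(rows, planes[:, k], deg)
        smoothed[:, k] = np.polyval(poly, rows)

    # re-normalize so each smoothed plane normal has unit length again
    smoothed /= np.linalg.norm(smoothed[:, :3], axis=1, keepdims=True)
    return rows, smoothed
```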

Planar Mirror Calibration
Calibration procedure:
- Record planar checkerboard patterns (placed against the mirrors in two images)
- Find corners in the real and reflected images
- Solve for the checkerboard position/pose (also find an initial mirror position/pose)
- Ray-trace through the reflected corners
- Optimize {R_M1, T_M1} to minimize the backprojected checkerboard corner error
- Repeat for the second mirror {R_M2, T_M2}
[Figure: mirror geometry showing the camera, point reflection, and ray reflection]
References: [1,7]
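Once a mirror pose {R_M, T_M} is recovered, reflecting the camera center and its rays about the mirror plane is what creates the virtual views used for reconstruction. A small Python sketch of the two reflection operations, with the mirror written as a unit normal n and offset d (n·X + d = 0); the numeric values are placeholders:

```python
import numpy as np

def reflect_point(p, n, d):
    """Reflect a 3D point about the mirror plane n·X + d = 0 (||n|| = 1)."""
    return p - 2.0 * (n @ p + d) * n

def reflect_direction(v, n):
    """Reflect a ray direction about the mirror plane (the offset d drops out)."""
    return v - 2.0 * (n @ v) * n

# Example: construct the virtual camera for one calibrated mirror.
# The plane values are made up; in practice they come from {R_M1, T_M1}.
n = np.array([0.70710678, 0.0, -0.70710678])   # unit mirror normal
d = 400.0                                      # plane offset (mm)
virtual_center = reflect_point(np.zeros(3), n, d)   # physical camera at origin

ray = np.array([0.05, -0.02, 1.0])             # a camera ray direction
virtual_ray = reflect_direction(ray, n)        # the same ray seen via the mirror
```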

Reconstruction Algorithm
[Figure: Gray code sequence and recovered projector rows]
Step 1: Recover projector rows
- Project the Gray code image sequence
- Recover the projector scanline illuminating each pixel
- Post-process using image morphology
Step 2: Recover the 3D point cloud
- Reconstruct using ray-plane intersection, considering each real/virtual camera separately
- Assign per-point color using the ambient image
[Figure: real and virtual cameras, showing camera centers and optical rays]
References: [1]
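Step 2 ties the earlier pieces together: each pixel's decoded projector row selects one calibrated light plane, and the pixel's ray (reflected about the mirror plane for virtual views) is intersected with it, once per real or virtual camera. A condensed Python sketch under the same assumptions as the snippets above; the argument layout is illustrative:

```python
import numpy as np

def reconstruct_view(row_index, pixel_rays, row_planes, ray_origin):
    """Triangulate one real or virtual view by ray-plane intersection.

    row_index  : HxW array of decoded projector rows (-1 marks invalid pixels)
    pixel_rays : HxWx3 ray directions for this view (already reflected about
                 the mirror plane when the view is a virtual camera)
    row_planes : (num_rows, 4) calibrated light planes (n_x, n_y, n_z, d)
    ray_origin : 3-vector, the center of this real or virtual camera
    """
    points = []
    for (r, c), row in np.ndenumerate(row_index):
        if row < 0:
            continue
        n, d = row_planes[row, :3], row_planes[row, 3]
        v = pixel_rays[r, c]
        t = -(n @ ray_origin + d) / (n @ v)
        points.append(ray_origin + t * v)
    return np.array(points)
```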

Outline
- Introduction and Related Work
- System Design and Construction
- Calibration and Reconstruction
- Experimental Results
- Conclusions and Future Work

Experimental Reconstruction Results
[Figure: ambient illumination, Gray code sequence, and recovered projector rows]

Outline
- Introduction and Related Work
- System Design and Construction
- Calibration and Reconstruction
- Experimental Results
- Conclusions and Future Work

Conclusions and Future Work
Primary accomplishments:
- Experimentally demonstrated surround structured lighting
- Developed a complete calibration procedure for the prototype apparatus
Secondary accomplishments:
- Proposed practical methods for orthographic projector construction/calibration
- Extended the Camera Calibration Toolbox for general projector-camera calibration
Future work:
- Sub-pixel light-plane localization
- Evaluate quantitative reconstruction accuracy
- Apply post-processing to the point cloud (e.g., filtering, implicit surface fitting, texture blending)
- Increase the scanning volume
- Flatbed scanner configuration (i.e., no projector)
- Extend to real-time shape acquisition in the round (e.g., using a de Bruijn pattern [Zhang '02])
References: [16]

References
3DIM 2007: Surround Structured Lighting
1. D. Lanman, D. Crispell, and G. Taubin. Surround Structured Lighting for Full Object Scanning. 3DIM 2007.
Related Work: Orthographic Projectors and Structured Light with Mirrors
2. S. K. Nayar and V. Anand. Projection Volumetric Display Using Passive Optical Scatterers. Technical Report, July 2006.
3. E. Epstein, M. Granger-Piché, and P. Poulin. Exploiting Mirrors in Interactive Reconstruction with Structured Light. Vision, Modeling, and Visualization 2004.
Multi-view Reconstruction using Planar Mirrors
4. K. Forbes, F. Nicolls, G. de Jager, and A. Voigt. Shape-from-Silhouette with Two Mirrors and an Uncalibrated Camera. ECCV 2006.
5. J. Gluckman and S. Nayar. Planar Catadioptric Stereo: Geometry and Calibration. CVPR 1999.
6. B. Hu, C. Brown, and R. Nelson. Multiple-view 3D Reconstruction Using a Mirror. Technical Report, May 2005.
7. I.-C. Lin, J.-S. Yeh, and M. Ouhyoung. Extracting Realistic 3D Facial Animation Parameters from Multi-view Video Clips. IEEE Computer Graphics and Applications, 2002.

References
3D Reconstruction using Structured Light
8. J. Salvi, J. Pages, and J. Batlle. Pattern Codification Strategies in Structured Light Systems. Pattern Recognition, April 2004.
9. S. Inokuchi, K. Sato, and F. Matsuda. Range Imaging System for 3D Object Recognition. Proceedings of the International Conference on Pattern Recognition, 1984.
Projector and Camera Calibration Methods
10. R. Legarda-Sáenz, T. Bothe, and W. P. Jüptner. Accurate Procedure for the Calibration of a Structured Light System. Optical Engineering, 2004.
11. R. Raskar and P. Beardsley. A Self-correcting Projector. CVPR 2001.
12. S. Zhang and P. S. Huang. Novel Method for Structured Light System Calibration. Optical Engineering, 2006.
13. J.-Y. Bouguet. Complete Camera Calibration Toolbox for Matlab. http://www.vision.caltech.edu/bouguetj/calib_doc.
Visual Hull: Silhouette-based 3D Reconstruction
14. A. Laurentini. The Visual Hull Concept for Silhouette-based Image Understanding. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1994.

References
Real-time Shape Acquisition
15. S. Rusinkiewicz, O. Hall-Holt, and M. Levoy. Real-time 3D Model Acquisition. SIGGRAPH 2002.
16. L. Zhang, B. Curless, and S. M. Seitz. Rapid Shape Acquisition Using Color Structured Light and Multi-pass Dynamic Programming. 3DPVT 2002.
17. S. Zhang and P. S. Huang. High-resolution, Real-time Three-dimensional Shape Measurement. Optical Engineering, 2006.