Image Based Rendering


Image Based Rendering: an overview

Photographs

We have tools that acquire, and tools that display, photographs at a convincing quality level.


Photographs

We have had tools that acquire, and tools that display, photographs at a convincing quality level for almost 100 years now.

Sergei Mikhailovich Prokudin-Gorskii. A Settler's Family, ca. 1907-1915.

Sergei Mikhailovich Prokudin-Gorskii. Tea Factory in Chakva. Chinese Foreman Lau-Dzhen-Dzhau. ca. 1907-1915.

Sergei Mikhailovich Prokudin-Gorskii. The Emir of Bukhara, 1911.

RGB in the early 1900s.

Plenoptic function

Defines all the rays through any point in space (x, y, z), with any orientation (θ, φ), over all wavelengths (λ), at any given moment in time (t):

ρ = P(x, y, z, θ, φ, λ, t)

IBR summary

Representations of the plenoptic function range from implicit to explicit:

- geometric model + texture mapping
- panoramas
- view morphing
- 3D image warping
- ray databases

Lightfield / Lumigraph approach [Levoy96, Gortler96]

- Take all the photographs you will ever need to display.
- The model becomes a database of rays.
- Rendering becomes database querying.

Overview

- Introduction
- Lightfield / Lumigraph: definition, construction, compression

From 7D to 4D

ρ = P(x, y, z, θ, φ, λ, t)

- Static scene: t is constant.
- λ is approximated with RGB.
- Consider only the convex hull of the objects, so the origin of the ray does not matter.

4D Lightfield / Lumigraph

Discrete 4D Lightfield

Lightfield: a set of images with centers of projection (COPs) on a regular grid.

or: a lightfield is a set of images of a point seen from various angles.

Depth correction of rays
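The two-plane (s, t, u, v) parameterization behind the 4D lightfield can be sketched as: to index the ray database, intersect a viewing ray with both parameterization planes. A minimal sketch, assuming axis-aligned planes at z = 0 and z = 1; the function name and plane placement are illustrative, not from the slides:

```python
import numpy as np

def ray_to_lightfield_coords(origin, direction, st_z=0.0, uv_z=1.0):
    """Intersect a ray with the two parameterization planes.

    Hypothetical setup: the (s, t) plane at z = st_z and the (u, v)
    plane at z = uv_z, both axis-aligned. Returns the 4D database index.
    """
    o, d = np.asarray(origin, float), np.asarray(direction, float)
    if abs(d[2]) < 1e-12:
        raise ValueError("ray parallel to parameterization planes")
    # Ray parameter where o.z + t * d.z equals each plane's z.
    t_st = (st_z - o[2]) / d[2]
    t_uv = (uv_z - o[2]) / d[2]
    s, t = (o + t_st * d)[:2]
    u, v = (o + t_uv * d)[:2]
    return s, t, u, v
```

A ray leaving (1, 2, 0) straight down the z axis indexes (s, t) = (1, 2) and (u, v) = (1, 2); a slanted ray hits the two planes at different (x, y) positions, which is exactly what the 4D index encodes.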


Construction from a dense set of photographs.

Construction from a sparse set of photographs: camera positions, acquisition stage, blue screening, space carving.

Filling in gaps using the pull-push algorithm

- Pull phase: low-resolution levels are created; gaps shrink.
- Push phase: gaps at the high-resolution levels are filled in using the low-resolution levels.
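The two phases above can be sketched in NumPy. This is an illustrative single-channel implementation, not the original Lumigraph code; `pull_push_fill` is a hypothetical name, and the 2x2 weighted averaging is one simple choice of pull filter:

```python
import numpy as np

def pull_push_fill(img, mask):
    """Fill gaps (mask == 0) in a 2D image with a pull-push sweep."""
    levels = [(img.astype(float), mask.astype(float))]
    # Pull phase: build low-resolution levels; gaps shrink as valid
    # neighbors dominate each 2x2 weighted average.
    while min(levels[-1][0].shape) > 1:
        c, w = levels[-1]
        h, wd = c.shape[0] // 2 * 2, c.shape[1] // 2 * 2
        cw, w2 = (c * w)[:h, :wd], w[:h, :wd]
        num = cw[0::2, 0::2] + cw[1::2, 0::2] + cw[0::2, 1::2] + cw[1::2, 1::2]
        den = w2[0::2, 0::2] + w2[1::2, 0::2] + w2[0::2, 1::2] + w2[1::2, 1::2]
        coarse = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
        levels.append((coarse, (den > 0).astype(float)))
    # Push phase: fill remaining gaps at each level from the next-coarser
    # level (nearest-neighbor upsampling for simplicity).
    for lvl in range(len(levels) - 2, -1, -1):
        fine, wf = levels[lvl]
        up = np.kron(levels[lvl + 1][0], np.ones((2, 2)))
        up = up[:fine.shape[0], :fine.shape[1]]
        if up.shape != fine.shape:  # odd-sized level: pad by edge replication
            up = np.pad(up, ((0, fine.shape[0] - up.shape[0]),
                             (0, fine.shape[1] - up.shape[1])), mode="edge")
        levels[lvl] = (np.where(wf > 0, fine, up), np.ones_like(wf))
    return levels[0][0]
```

A single missing pixel surrounded by valid samples gets filled with the weighted average of its block, pushed back down from the coarser level.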

Compression

Large size uncompressed: 1.125 GB = 32x32 (s, t) x 256x256 (u, v) x 6 faces x 3 B per sample.

Compression: JPEG + MPEG (200:1, down to 6 MB), or vector quantization + entropy encoding.
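The quoted uncompressed size follows directly from the sampling resolutions on the slide:

```python
# 32x32 (s, t) samples x 256x256 (u, v) samples x 6 cube faces x 3 bytes (RGB).
size_bytes = 32 * 32 * 256 * 256 * 6 * 3
size_gib = size_bytes / 2**30
print(size_bytes, size_gib)  # 1207959552 bytes, exactly 1.125 GiB
```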

Vector Quantization (VQ)

Principle:
- a codebook made of codewords
- replace each actual word with the closest codeword

Implementation:
- training on a representative set of words to derive the best codebook
- compression: replace each word with the index of the closest codeword
- decompression: retrieve the indexed codeword from the codebook

Lightfield compression using VQ.
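The compression and decompression steps above can be sketched as follows (training the codebook, e.g. with a k-means-style pass, is omitted; the function names are illustrative):

```python
import numpy as np

def vq_compress(words, codebook):
    """Compression: replace each word with the index of its closest codeword."""
    # Euclidean distance between every word and every codeword.
    d = np.linalg.norm(words[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def vq_decompress(indices, codebook):
    """Decompression: retrieve the indexed codewords from the codebook."""
    return codebook[indices]
```

Storing an index per word instead of the word itself is where the compression comes from; the reconstruction error is the distance to the nearest codeword.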

View morphing

Motivation: rendering from images [Seitz96]. Given a left image and a right image, create intermediate images that simulate camera movement.

Previous work

- Panoramas ([Chen95], etc.): the user can look in any direction, but only at a few given locations.
- Image morphing ([Wolberg90], [Beier92], etc.): linearly interpolates intermediate positions of features. Input: two images and correspondences; output: a metamorphosis of one image into the other as a sequence of intermediate images.

Previous work: limitations

- Panoramas: no camera translations allowed.
- Image morphing: not shape-preserving; an image morph is also a morph of the object. To simulate rendering with morphing, the object should stay rigid while the camera moves.

Overview

- Introduction
- Image morphing
- View morphing: image pre-warping, image morphing, image post-warping


Image morphing

1. Correspondences.
2. Linear interpolation: P_k = (1 - k/n) P_0 + (k/n) P_n, for frames 0, k, n.

Image morphing is not shape-preserving.
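The interpolation step above can be written directly; a minimal sketch with an illustrative function name:

```python
import numpy as np

def interpolate_correspondence(p0, pn, k, n):
    """Feature position in frame k of an n-frame morph:
    P_k = (1 - k/n) P_0 + (k/n) P_n."""
    t = k / n
    return (1.0 - t) * np.asarray(p0, float) + t * np.asarray(pn, float)
```

At k = 0 this returns P_0, at k = n it returns P_n, and halfway through the morph each feature sits at the midpoint of its two corresponding positions.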

Early IBR research: Soft Watch at Moment of First Explosion, Salvador Dali, 1954.

View morphing

A shape-preserving morph, in three steps:
1. Prewarp the first and last images to parallel views.
2. Image-morph between the prewarped images.
3. Postwarp to the interpolated view.

Step 1: prewarp to parallel views

Parallel views share the same image plane, with the image plane parallel to the segment connecting the two centers of projection.

Prewarp:
- compute the parallel views I_0p, I_np
- rotate I_0 and I_n to the parallel views
- prewarp the correspondences: (P_0, P_n) -> (P_0p, P_np)

Step 2: morph parallel images

- Shape-preserving.
- Use the prewarped correspondences.
- Interpolate C_k from C_0 and C_n.

Step 3: postwarping

Postwarp the morphed image:
- create the intermediate view: C_k is known; interpolate the view direction and tilt
- rotate the morphed image to the intermediate view

View morphing is shape-preserving.

Overview: view morphing in more detail (image pre-warping, image morphing, image post-warping).

Step 1: prewarp to parallel views

Choosing the parallel views:
- use C_0C_n as the x axis (the a_p vector)
- use (a_0 x b_0) x (a_n x b_n) as the y axis (-b_p)
- pick a_p and b_p to resemble a_0 and b_0 as much as possible
- use the same pixel size
- use a wider field of view

Step 1: prewarp to parallel views (continued)

Prewarping using texture mapping:
- create a polygon for the image plane
- consider it texture-mapped with the image itself
- render the scene from the prewarped view
- if you go this route you will have to implement clipping with the COP plane (you have texture mapping already)

Alternative: prewarping by reprojection of rays — look up all the rays of the prewarped view in the original view.

Prewarping the correspondences: for each pair of correspondences (P_0, P_n), project P_0 onto I_0p to compute P_0p, and project P_n onto I_np to compute P_np; the prewarped correspondence is (P_0p, P_np).

Step 2: morph parallel images

- Use the prewarped correspondences to compute a correspondence for all pixels in I_0p.
- Linearly interpolate I_0p to the intermediate positions.
- Useful observation: corresponding pixels are on the same line in the prewarped views.
- Preventing holes: use a larger footprint (e.g., 2x2), or linearly interpolate between consecutive samples, or postprocess the morphed image, looking for background pixels and replacing them with neighboring values.
- Visibility artifacts (collision of samples): z-buffer on disparity.
- Holes: also morph I_np to I_kp, and use additional views.

Step 3: postwarping

- Create the intermediate view: C_k is known; the current view direction is a linear interpolation of the start and end view directions, and the current up vector is a linear interpolation of the start and end up vectors.
- Rotate the morphed image to the intermediate view (same as prewarping).

Image-Based Rendering by Warping

Overview

- Introduction
- Depth extraction methods
- Reconstruction for IBRW
- Visibility without depth
- Sample selection

Introduction: comparison to other IBR methods, the 3D warping equation, reconstruction.

IBR by Warping (IBRW)

Images enhanced with per-pixel depth [McMillan95].

IBR by Warping (IBRW)

P = C_1 + (c_1 + u_1 a_1 + v_1 b_1) w_1

where w_1 scales the ray through pixel (u_1, v_1); 1/w_1 is also called the generalized disparity, with the alternative notation δ(u_1, v_1).

The same point expressed through a second camera:

P = C_2 + (c_2 + u_2 a_2 + v_2 b_2) w_2

3D warping equations

u_2 = (w_11 + w_12 u_1 + w_13 v_1 + w_14 δ(u_1, v_1)) / (w_31 + w_32 u_1 + w_33 v_1 + w_34 δ(u_1, v_1))
v_2 = (w_21 + w_22 u_1 + w_23 v_1 + w_24 δ(u_1, v_1)) / (w_31 + w_32 u_1 + w_33 v_1 + w_34 δ(u_1, v_1))

A complete IBR method

- Façade system: a (coarse) geometric model is needed.
- Panoramas: the viewer is confined to the center of the panorama.
- View morphing: correspondences are needed.
- IBRW: rendering to arbitrary new views.
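The warping equations above map a pixel plus its generalized disparity to the new view, once the coefficients w_ij have been assembled from the two cameras' parameters. A minimal sketch that assumes the 3x4 coefficient matrix W is given:

```python
import numpy as np

def warp_pixel(W, u1, v1, disparity):
    """Apply the 3D warping equations to pixel (u1, v1).

    W is the 3x4 matrix [[w11..w14], [w21..w24], [w31..w34]] of warp
    coefficients (assumed precomputed from the camera parameters);
    `disparity` is the generalized disparity delta(u1, v1).
    """
    num_u = W[0, 0] + W[0, 1] * u1 + W[0, 2] * v1 + W[0, 3] * disparity
    num_v = W[1, 0] + W[1, 1] * u1 + W[1, 2] * v1 + W[1, 3] * disparity
    den   = W[2, 0] + W[2, 1] * u1 + W[2, 2] * v1 + W[2, 3] * disparity
    return num_u / den, num_v / den
```

With the identity-like coefficients below (same camera for both views), every pixel maps to itself regardless of disparity contribution in the numerators.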

DeltaSphere: a depth-and-color acquisition device (Lars Nyland et al.).


Reconstructing by splatting

- Estimate the shape and size of the footprint of the warped samples: this is expensive to do accurately, and image quality suffers if it is crudely approximated.
- Samples are z-buffered.

Finding depth

Overview

- Depth from stereo
- Depth from structured light
- Depth from focus / defocus
- Laser rangefinders

Depth from stereo

Two cameras with known parameters; infer the 3D location of a point seen in both images. Subproblem: correspondences — for a point seen in the left image, find its projection in the right image.

Depth from stereo: déjà-vu math

P = C_1 + (c_1 + u_1 a_1 + v_1 b_1) w_1
P = C_2 + (c_2 + u_2 a_2 + v_2 b_2) w_2

The unknowns are w_1 and w_2, and the system is overconstrained. The (u_2, v_2) coordinates of a point seen at (u_1, v_1) are constrained to an epipolar line.
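The overconstrained system above can be solved by least squares. Writing d_i = c_i + u_i a_i + v_i b_i for each pixel's ray direction, a minimal triangulation sketch (the function name and midpoint convention are illustrative choices):

```python
import numpy as np

def triangulate(C1, d1, C2, d2):
    """Least-squares 3D point from two viewing rays P = C_i + w_i d_i.

    Solves the overconstrained system C1 + w1 d1 = C2 + w2 d2 for the
    ray parameters w1, w2, then returns the midpoint of the two closest
    points (they coincide when the rays actually intersect).
    """
    C1, d1, C2, d2 = (np.asarray(a, float) for a in (C1, d1, C2, d2))
    A = np.stack([d1, -d2], axis=1)          # 3x2: w1 d1 - w2 d2 = C2 - C1
    b = C2 - C1
    (w1, w2), *_ = np.linalg.lstsq(A, b, rcond=None)
    return 0.5 * ((C1 + w1 * d1) + (C2 + w2 * d2))
```

For noisy correspondences the two rays do not quite intersect, and the midpoint of their closest approach is a standard compromise.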

Epipolar line

C_1, C_2, and P_1 define a plane; P_2 lies on that plane, and P_2 also lies on image plane 2. So P_2 lies on the line where the two planes intersect.

Search for correspondences on the epipolar line

This reduces the dimensionality of the search space: walk along the epipolar segment rather than searching the entire image.

Parallel views

The preferred stereo configuration: the epipolar lines are horizontal and easy to search.

Limit the search to the epipolar segment from u_2 = u_1 (P infinitely far away) to 0 (P close).

Depth precision analysis

- 1/z is linear in the disparity (u_1 - u_2).
- Depth resolution is better for nearby objects.
- It is important to determine correspondences with subpixel accuracy.
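For parallel views the relation between disparity and depth is the standard z = f b / (u_1 - u_2), which is why 1/z is linear in the disparity. A minimal sketch (the function name is illustrative; f is in pixels and the baseline b in scene units):

```python
def depth_from_disparity(f, baseline, u1, u2):
    """Depth in a parallel-view stereo pair: z = f * b / (u1 - u2).

    Since 1/z is linear in the disparity, a fixed disparity error costs
    far more depth accuracy for distant points than for nearby ones.
    """
    disparity = u1 - u2
    if disparity <= 0:
        raise ValueError("point at or beyond infinity")
    return f * baseline / disparity
```

With f = 1000 px and a 0.1 m baseline, a 50 px disparity puts the point 2 m away; a one-pixel error there shifts z by about 4 cm, while the same error at 5 px disparity shifts it by meters.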

Depth from stereo problem: correspondences are difficult to find.

Structured light approach: replace one camera with a projector and project easily detectable patterns; establishing correspondences becomes a lot easier.

Depth from structured light

P = C_1 + (c_1 + u_1 a_1 + v_1 b_1) w_1
P = C_2 + (c_2 + u_2 a_2 + v_2 b_2) w_2

C_1 is a projector that projects a pattern centered at (u_1, v_1); the pattern center hits the scene at P. Camera C_2 sees the pattern at (u_2, v_2), which is easy to find, and the 3D location of P is determined.

Depth from structured light: challenges

- Those associated with using projectors: expensive, cannot be used outdoors, not portable.
- Difficult to identify the pattern: I found a corner, but which corner is it?
- Invasive: it changes the color of the scene (one could use invisible light, e.g., IR).


Depth of field

Thin lenses:
- rays through the lens center (C) do not change direction
- rays parallel to the optical axis go through the focal point (F')

Depth of field: for a given focal length, only objects at a certain depth are in focus.

Out of focus

When an object is at a different depth, one point projects to several locations in the image: it is out of focus, and the image is blurred.

Focusing

Move the lens to focus at the new depth. The relationship between focus and depth can be exploited to extract depth.

Determine z for points in focus

With aperture a, focal length f, object height h at depth z, and in-focus image height h_i at distance z_i behind the lens, similar triangles relate h_i/h to z_i/z, and the thin-lens equation 1/z + 1/z_i = 1/f determines z.

Depth from defocus

- Take images of a scene with various camera parameters.
- By measuring the defocus variation, infer the range to the objects.
- Does not need to find the best focusing planes for the various objects.

Examples by Shree Nayar, Columbia U.
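Solving the thin-lens equation for the object depth gives z = 1 / (1/f - 1/z_i); a minimal sketch, assuming the in-focus image distance z_i has been measured (the function name is illustrative):

```python
def depth_in_focus(f, z_i):
    """Depth z of in-focus points from the thin-lens equation:
    1/z + 1/z_i = 1/f  =>  z = 1 / (1/f - 1/z_i)."""
    return 1.0 / (1.0 / f - 1.0 / z_i)
```

For example, with f = 2 and an image distance z_i = 3 (same units), the in-focus depth is z = 6, since 1/2 - 1/3 = 1/6.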


Laser rangefinders

Send a laser beam to measure the distance: like RADAR, measure the time of flight.

DeltaSphere: a depth-and-color acquisition device (Lars Nyland et al.). Courtesy 3rdTech Inc.
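The time-of-flight measurement converts to range with d = c t / 2, since the beam travels out to the surface and back; a minimal sketch with an illustrative function name:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_seconds):
    """Distance from a rangefinder's measured round-trip time: d = c * t / 2."""
    return C * round_trip_seconds / 2.0
```

A 20 ns round trip corresponds to roughly 3 m of range, which shows why such devices need sub-nanosecond timing to resolve centimeters.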

300° x 300° panorama: the reflected light.

300° x 300° panorama: the range data.

Spherical range panoramas and their planar re-projection (courtesy 3rdTech Inc.).

Jeep: one scan (courtesy 3rdTech Inc.).

Complete Jeep model (courtesy 3rdTech Inc.).
