Real-Time Virtual Viewpoint Generation on the GPU for Scene Navigation

Real-Time Virtual Viewpoint Generation on the GPU for Scene Navigation
May 31st, 2010
Presenter: Robert Laganière
Co-author: Shanat Kolhatkar

Contributions
Our work brings forth three main contributions:
- The calculation of new viewpoints in real time using a view morphing algorithm
- The ability to navigate the environment in real time through the use of GPU programming
- Enhancing the optical flow to reduce the number of noticeable artifacts

Previous Works
Optical flow calculation: the Ogale and Aloimonos algorithm
- Handles occlusions
- Contrast-invariant matching function
- Identification of slanted surfaces
- Implementation available for free at: http://www.cs.umd.edu/ogale/download/code.html

Previous Works
Texture morphing [2]
- Linear color interpolation and motion compensation to generate the composite texture
- Does not depend much on user input
- Works with a wide variety of textures
- Produces high-quality results
- Needs features of similar sizes in both origin and target textures; cannot morph smoothly between highly different textures

Previous Works
Terms:
- c_i, w_j: color and distance weights
- I_i: color map of texture i
- W_ij: displacement map from texture i to texture j

Virtual Navigation Software
- Using panoramas: Google Street View, EveryScape
- Using 3D reconstruction: Bing, Google Earth, VillesEn3D

Screenshots: Bing and VillesEn3D.

Google Street View: a captured panorama.
Google Street View: a transition image example.

Part 1: Enhancing the Optical Flow

Part 1: The Extended Cube
An example of the previously used cubic panorama representation.

Part 1: The Extended Cube
Description of the extended cube: each face is extended by a certain number of pixels on each side during generation.
Goal: correctly estimate the displacement of a visual point moving from one face to another (small displacements).

Part 1: The Extended Cube
Extended cube example.
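As a rough illustration of how such an extended face can be generated (the slides do not give the exact construction, so the face-basis convention and margin handling below are assumptions), the extra border pixels simply correspond to rays that look past the edge of the 90-degree face into the neighboring faces:

```cpp
// Sketch: map a pixel of an extended cube face to a 3D ray direction.
// Pixels inside the margin get face coordinates outside [-1, 1], i.e. they
// sample "around the edge" of the original face.
struct Vec3 { float x, y, z; };

// faceSize: resolution of the original face; margin: extra pixels on each side.
// right/up/forward: orthonormal basis of the face being generated.
Vec3 extendedFacePixelDirection(int i, int j, int faceSize, int margin,
                                Vec3 right, Vec3 up, Vec3 forward) {
    // Indices of the (faceSize + 2*margin)^2 extended face mapped to face
    // coordinates; u and v exceed [-1, 1] inside the extended border.
    float u = 2.0f * (i - margin + 0.5f) / faceSize - 1.0f;
    float v = 2.0f * (j - margin + 0.5f) / faceSize - 1.0f;
    return Vec3{forward.x + u * right.x + v * up.x,
                forward.y + u * right.y + v * up.y,
                forward.z + u * right.z + v * up.z};  // normalize before sampling
}
```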

Part 1: The Extended Cube
3D view of the extended cube.

Part 1: The Extended Cube
Boundary handling: some displacement vectors are computed twice because they appear in the extended portion of each of two adjacent faces. We check which of the two solutions gives the best result by comparing the color neighborhoods in both images according to the L2-norm; the vector with the greater distance is then replaced by the other one.
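A minimal sketch of this duplicate-vector resolution, assuming simple image and flow types (the default radius of 5 gives the 11-pixel neighborhoods mentioned later in the results):

```cpp
// Keep the candidate flow vector whose color neighborhood matches better
// under the L2-norm, as described above. Image layout is an assumption.
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };
struct Image {
    int width, height;
    std::vector<float> rgb;  // 3 floats per pixel, row-major
    const float* at(int x, int y) const { return &rgb[3 * (y * width + x)]; }
};

// Sum of squared color differences between the neighborhood of (x, y) in the
// origin image and the neighborhood of (x, y) + flow in the target image.
float neighborhoodDistance(const Image& origin, const Image& target,
                           int x, int y, Vec2 flow, int radius = 5) {
    float dist = 0.f;
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            int ox = x + dx, oy = y + dy;
            int tx = static_cast<int>(std::lround(ox + flow.x));
            int ty = static_cast<int>(std::lround(oy + flow.y));
            if (ox < 0 || oy < 0 || ox >= origin.width || oy >= origin.height) continue;
            if (tx < 0 || ty < 0 || tx >= target.width || ty >= target.height) continue;
            const float* a = origin.at(ox, oy);
            const float* b = target.at(tx, ty);
            for (int c = 0; c < 3; ++c) dist += (a[c] - b[c]) * (a[c] - b[c]);
        }
    return dist;
}

// Of the two duplicated vectors, keep the one with the smaller distance.
Vec2 resolveDuplicate(const Image& origin, const Image& target,
                      int x, int y, Vec2 candA, Vec2 candB) {
    return neighborhoodDistance(origin, target, x, y, candA) <=
           neighborhoodDistance(origin, target, x, y, candB) ? candA : candB;
}
```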

Interpolated image using a cube vs. interpolated image using our extended cube (comparison images).

Part 1: Smoothing the Flow Field
We assume that the scene is static, so neighboring flow vectors should have a similar orientation.
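The slides do not spell out the smoothing procedure itself; the sketch below is only one plausible correction pass built on the stated assumption, replacing a vector whose orientation disagrees with its neighborhood average:

```cpp
// Illustrative smoothing pass (not necessarily the authors' exact procedure):
// a flow vector whose orientation deviates strongly from the average of its
// 11x11 neighborhood is replaced by that average. Threshold is an assumption.
#include <cmath>
#include <vector>

struct Flow { float u, v; };

void smoothFlowField(std::vector<Flow>& flow, int width, int height,
                     int radius = 5, float maxAngle = 0.5f /* radians */) {
    const float kPi = 3.14159265f;
    std::vector<Flow> out = flow;
    for (int y = radius; y < height - radius; ++y)
        for (int x = radius; x < width - radius; ++x) {
            // Average flow over the (2*radius+1)^2 neighborhood.
            float su = 0.f, sv = 0.f;
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dx = -radius; dx <= radius; ++dx) {
                    const Flow& f = flow[(y + dy) * width + (x + dx)];
                    su += f.u;
                    sv += f.v;
                }
            const int n = (2 * radius + 1) * (2 * radius + 1);
            const Flow avg{su / n, sv / n};
            // Orientation difference, wrapped into [-pi, pi].
            const Flow& c = flow[y * width + x];
            float d = std::atan2(c.v, c.u) - std::atan2(avg.v, avg.u);
            if (d > kPi)  d -= 2.f * kPi;
            if (d < -kPi) d += 2.f * kPi;
            if (std::fabs(d) > maxAngle) out[y * width + x] = avg;
        }
    flow.swap(out);
}
```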

Part 1: Smoothing the Flow Field
Comparison images: original, destination, smoothed, uncorrected.

Part 2: View Interpolation

Part 2: View Interpolation
Each extended face of the cube is calculated independently from the others. We use the previously calculated dense flow field to interpolate the position of the pixels. We want the color and the motion to be interpolated jointly; our interpolation is therefore written as shown below.
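The equation itself did not survive the transcription. A standard flow-based joint interpolation that is consistent with the description above (an assumption, not necessarily the exact expression on the slide) is

I_t(x + t·W(x)) = (1 − t)·I_0(x) + t·I_1(x + W(x)),

where I_0 and I_1 are the origin and target faces, W is the dense flow field from I_0 to I_1, and t in [0, 1] is the interpolation parameter: pixel positions are moved along the flow while the two colors are cross-dissolved.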

Part 3: Real-Time Navigation
Four components are necessary to achieve interactive navigation:
- Pre-calculation of the flow
- Multi-threading
- Buffering
- GPU implementation of the interpolation

Part 3: Multi-Threading
We use three threads:
- One thread to handle the graphics calls (in our architecture, the OpenGL calls)
- One thread to load the data from the hard drive
- One thread to calculate the interpolated images, which is actually run on the GPU

Part 3: Buffering
The graph shown on the slide is an example of a possible panorama setting. Suppose that our buffer size is 7. If the user is currently at viewpoint P1, the buffer is filled with the panoramas and displacement fields of the following viewpoints: P1, P0, P2, P5, P8, P3, P6.
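A small sketch of this buffering policy, assuming the viewpoints form a graph stored as an adjacency list and that panoramas are buffered in breadth-first order of graph distance from the current viewpoint (which matches the P1 example above; the actual graph is only shown on the slide):

```cpp
// Fill a fixed-size buffer with the viewpoints closest (in graph distance)
// to the current viewpoint, visited breadth-first.
#include <cstddef>
#include <queue>
#include <unordered_set>
#include <vector>

std::vector<int> selectBufferedViewpoints(
        const std::vector<std::vector<int>>& adjacency,  // viewpoint graph
        int current, std::size_t bufferSize) {
    std::vector<int> buffered;
    std::unordered_set<int> visited{current};
    std::queue<int> frontier;
    frontier.push(current);
    while (!frontier.empty() && buffered.size() < bufferSize) {
        int node = frontier.front();
        frontier.pop();
        buffered.push_back(node);            // load panorama + flow for this node
        for (int next : adjacency[node])
            if (visited.insert(next).second) // enqueue unseen neighbors
                frontier.push(next);
    }
    return buffered;
}
```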

Part 3: GPU Implementation
Each pixel's interpolation is calculated independently of the others (we only need the origin and target pixel color values and the flow vectors). This makes it a perfect algorithm for GPU implementation.

Part 3: GPU Implementation
GPU algorithm.
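To make the per-pixel nature concrete, here is a minimal fragment-shader sketch of this kind of interpolation, written as a GLSL string in a C++ host program. It is not the authors' actual shader; the texture names, the flow encoding (displacements stored in the red/green channels, already converted to texture coordinates), and the backward-warping approximation are assumptions:

```cpp
// Each output pixel reads one flow vector and two colors, independently of
// every other pixel, which is why the interpolation maps so well to the GPU.
static const char* kViewMorphFragmentShader = R"(
#version 120
uniform sampler2D originTex;   // panorama face at the origin viewpoint
uniform sampler2D targetTex;   // panorama face at the target viewpoint
uniform sampler2D flowTex;     // dense flow origin -> target (texture coords)
uniform float t;               // interpolation parameter in [0, 1]

void main() {
    vec2 uv   = gl_TexCoord[0].st;
    vec2 flow = texture2D(flowTex, uv).rg;
    // Backward warping: sample each source image where the intermediate pixel
    // "comes from", then cross-dissolve the two colors.
    vec4 colorA = texture2D(originTex, uv - t * flow);
    vec4 colorB = texture2D(targetTex, uv + (1.0 - t) * flow);
    gl_FragColor = mix(colorA, colorB, t);
}
)";
```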

Part 4: Results
On the CPU, we created 20 transition images in 3 seconds. With our GPU-based interpolation scheme, we could generate up to 1000 images in 3 seconds with identical quality, roughly a 50× speedup. (We ran these tests on a Pentium M 1.7 GHz with a Radeon Mobility X700.)
The optical flow calculation and correction took up to an hour on a 320x320 image on an Athlon 64 X2 6000+ (3 GHz). We ran the optical flow calculation with a search interval of [-40, 40] for both x and y, and we set the size of all the neighborhoods in the correction pass to 11.

Part 4: Results
Degradation of results as the distance between the viewpoints increases: comparison images for steps of 1, 2, and 3.

Part 4: Results
Smoothing improvement: uncorrected vs. smoothed.
Shades-of-gray RMS comparison: origin, linear, uncorrected, smoothed, ground truth, destination.

Shades-of-gray RMS comparison (origin, destination, linear, uncorrected, smoothed) and RMS values.

Conclusion
We presented a new way of interpolating between pairs of panoramas in real time using the GPU, allowing a user to navigate inside a scene with a high degree of realism. Our main contribution is the interpolation of intermediate viewpoints of a scene in real time using a computed optical flow field. Except for a few artifacts, our results are of good quality, as long as the panoramas were taken at reasonable distances from each other.

References
[1] A. S. Ogale and Y. Aloimonos. A roadmap to the integration of early visual modules. International Journal of Computer Vision, 72:9–25, April 2007.
[2] W. Matusik, M. Zwicker, and F. Durand. Texture design using a simplicial complex of morphable textures. SIGGRAPH, 4:124–125, 2005.