Omni-directional stereoscopy

Paul Bourke (WASP, UWA)

Motivation
Correct stereoscopic views require a matching relationship between the viewing geometry and the rendering geometry. That is, the viewer's position relative to the display must be identical to the virtual camera's position relative to the projection plane. The implications apply to both flat and surround screen environments:
- Depth and scale relationships are only correct for a single viewing position, and hence for a single viewer.
- For a non-stationary viewer, head tracking is required to maintain the correct frustums.
To what extent can these requirements be relaxed for surround stereoscopic projection environments, those intended for an audience?
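For a flat, head-tracked display the matching requirement amounts to building an asymmetric frustum per eye from the tracked head position. The following is a minimal sketch (Python/NumPy), assuming the screen lies in the plane z = 0 and the viewer's head is upright; the screen extents and head position are illustrative values only, not taken from any installation described here.

```python
import numpy as np

def offaxis_frustum(eye, screen_lo, screen_hi, near, far):
    """Asymmetric (off-axis) frustum for a screen in the plane z = 0 spanning
    screen_lo = (x0, y0) to screen_hi = (x1, y1), viewed from 'eye' with eye[2] > 0.
    The returned values correspond to glFrustum(left, right, bottom, top, near, far)."""
    scale = near / eye[2]                      # similar triangles: screen plane -> near plane
    left   = (screen_lo[0] - eye[0]) * scale
    right  = (screen_hi[0] - eye[0]) * scale
    bottom = (screen_lo[1] - eye[1]) * scale
    top    = (screen_hi[1] - eye[1]) * scale
    return left, right, bottom, top, near, far

# One frustum per eye: offset the tracked head position by half the
# interocular distance along the viewer's x axis (head assumed upright).
head = np.array([0.1, -0.05, 2.0])             # hypothetical tracked head position (metres)
ipd  = 0.065
frustum_left  = offaxis_frustum(head + [-ipd / 2, 0, 0], (-1.0, -0.75), (1.0, 0.75), 0.1, 100.0)
frustum_right = offaxis_frustum(head + [+ipd / 2, 0, 0], (-1.0, -0.75), (1.0, 0.75), 0.1, 100.0)
```

Without tracking, the head position must be fixed at an assumed sweet spot, which is exactly why the depth and scale relationships only hold for a single viewing position.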

Example illustrating depth distortion
In a non-head-tracked environment it is easy to experience the distortions by moving left/right and forward/backward: forward/back movement results in a compression/dilation of space, while side-to-side movement results in a shearing of space. It is easy to see why this occurs. Consider the cube on the far right of the figure and its projection onto the projection plane for the left and right eye. For a different viewer position the apparent depth of the cube changes, so movement forward and backward results in a compression and stretching of space (see the numerical sketch below).

Example illustrating multiple observers
A similar but more serious problem occurs for stereoscopic environments that surround a number of observers, or even a single user. For a flat display, multiple observers receive increasingly distorted views as their distance from the correct viewing position increases. In a surround stereoscopic projection space they can receive a totally incorrect view, with zero or even inverted stereoscopic image pairs. For example, in a cylindrical stereoscopic display an observer looking forward needs to receive totally different parallax information from an observer looking to the right. There can only be one image on the display, so how can multiple observers be supported? Even a single observer needs different stereo pairs as they look in different directions, even though they may not move.
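A small numerical sketch of the depth distortion described above, under stated assumptions: a flat screen in the plane z = 0, a scene point rendered for a reference eye pair 2 m from the screen, and a displaced viewer whose two viewing rays are reprojected through the same screen points. All positions are illustrative (Python/NumPy).

```python
import numpy as np

def screen_point(eye, p):
    """Intersection with the screen plane z = 0 of the ray from 'eye' through scene point 'p'."""
    t = eye[2] / (eye[2] - p[2])
    return eye + t * (p - eye)

def fused_point(eye_l, eye_r, s_l, s_r):
    """Point a viewer with eyes (eye_l, eye_r) perceives when fusing screen points
    s_l and s_r: the midpoint of closest approach of the two viewing rays."""
    d1, d2 = s_l - eye_l, s_r - eye_r
    w0 = eye_l - eye_r
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return 0.5 * ((eye_l + t * d1) + (eye_r + s * d2))

ipd = 0.065
p = np.array([0.2, 0.0, -1.0])                                        # scene point 1 m behind the screen
ref_l, ref_r = np.array([-ipd/2, 0, 2.0]), np.array([ipd/2, 0, 2.0])  # rendering eye positions, 2 m back
s_l, s_r = screen_point(ref_l, p), screen_point(ref_r, p)

# Viewer steps forward (z = 1): the point is perceived at about z = -0.5, i.e. space is compressed.
print(fused_point(np.array([-ipd/2, 0, 1.0]), np.array([ipd/2, 0, 1.0]), s_l, s_r))
# Viewer steps sideways (x = 0.5): depth is preserved but the point shifts laterally, i.e. space shears.
print(fused_point(np.array([0.5 - ipd/2, 0, 2.0]), np.array([0.5 + ipd/2, 0, 2.0]), s_l, s_r))
```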

Question?
Can one present strictly correct stereoscopic content to a number of observers within a large surround projection environment without distortion? Answer: no!
Can one present stereoscopic content to a number of observers within a large surround projection environment with an acceptable (unnoticed) level of distortion? Answer: yes!
The following cases have been either installed or tested:
1) A 10 m diameter cylindrical environment, first instance at iCinema, UNSW (AVIE). This easily supports a dozen participants, each optionally looking in different directions.
2) Hemispherical domes, both large (multiple observers) and small (single person), e.g. the iDome, WASP, UWA.

Stereoscopic panoramic images
Capturing stereoscopic panoramic images employs a slit camera pair separated by the interocular distance; the camera pair rotates about their combined center. For continuously rendered material there are no stitching errors. For a finite slit width, whether rendered or photographed, the stitching issues reduce as the slit width reduces; they also reduce with distance. Options include continuous rotating slit cameras such as the Roundshot, and multipass or vertex shader algorithms for OpenGL (a per-column ray construction is sketched below).
Roundshot camera. Raytraced stereoscopic cylindrical projection created for PovRay. Workflow developed for 3DStudioMax.
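The slit-camera construction can be stated per output column: each eye sits on a circle of radius half the interocular distance, centered on the rotation axis, and the ray for a column points along that column's azimuth. The sketch below (Python/NumPy) is a hedged restatement of that idea for an equirectangular stereo panorama; the axis conventions and pixel mapping are assumptions for illustration, not the exact camera rigs mentioned above.

```python
import numpy as np

def panorama_ray(col, row, width, height, ipd=0.065, eye=+1):
    """Ray origin and direction for one pixel of a stereoscopic (omni-directional)
    panorama. eye = -1 for the left eye, +1 for the right.
    Convention (assumed): x right, y forward, z up; azimuth measured from +y."""
    theta = ((col + 0.5) / width) * 2.0 * np.pi - np.pi        # azimuth, -pi .. pi
    phi   = np.pi / 2.0 - ((row + 0.5) / height) * np.pi       # elevation, +pi/2 .. -pi/2
    forward = np.array([np.sin(theta), np.cos(theta), 0.0])    # column view direction
    right   = np.array([np.cos(theta), -np.sin(theta), 0.0])   # perpendicular, in the horizontal plane
    origin  = eye * (ipd / 2.0) * right                        # eye on a circle of radius ipd/2
    direction = np.array([np.sin(theta) * np.cos(phi),
                          np.cos(theta) * np.cos(phi),
                          np.sin(phi)])
    return origin, direction
```

Rendering each column from its own eye position in this way is the continuous-rendering case noted above, which is why it exhibits no stitching errors; a photographic slit camera approximates it with a finite slit width.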

Results
Within a narrow region directly in front of an observer the stereo pairs are correct. As one considers the image further to the left and right of that position the error increases. We are saved by the limited field of view of the glasses: we don't experience stereo in our peripheral vision. The upshot is that multiple observers can be located within the cylinder and all see an acceptable stereo image irrespective of where they are looking. The distortions as one moves away from the center of the cylinder still exist, but interestingly they seem less noticeable than the equivalent flat-screen shearing effect.
(Figure: left/right stereo pairs at increasing distance, from almost zero parallax to more distant. Images courtesy Peter Morse.)

Stereoscopic fisheye images
Cast rays through each pixel (or subpixel) in fisheye space from each eye, simulating the head (and therefore eye) position as one pivots one's head about in the dome. In a raytracer it is easy to arbitrarily define an initial ray for each point on the fisheye projection plane for each eye. Still image pairs can be generated by capturing stereo spherical projections. Currently implemented for CG and still photography; there is no clear solution for filmed material.
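The fisheye case follows the same pattern: map each fisheye pixel to a view direction, then offset the eye perpendicular to the horizontal component of that direction so the parallax stays consistent as a seated viewer pivots their head. The sketch below (Python/NumPy) is a minimal illustration under assumed conventions (dome axis along +z, angular fisheye mapping, simple zenith fallback); it is not the custom PovRay camera referred to later.

```python
import numpy as np

def stereo_fisheye_ray(i, j, size, aperture_deg=180.0, ipd=0.065, eye=+1):
    """Ray for one pixel of a stereoscopic angular fisheye image.
    Convention (assumed for this sketch): fisheye axis is +z, x right, y forward;
    eye = -1 for left, +1 for right. Returns None outside the fisheye circle."""
    x = 2.0 * (i + 0.5) / size - 1.0               # pixel -> normalised fisheye coords in [-1, 1]
    y = 2.0 * (j + 0.5) / size - 1.0
    r = np.hypot(x, y)
    if r > 1.0:
        return None
    phi   = r * np.radians(aperture_deg) / 2.0     # angle from the dome axis
    theta = np.arctan2(y, x)                       # azimuth within the fisheye
    direction = np.array([np.cos(theta) * np.sin(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(phi)])
    # Eye offset: perpendicular to the horizontal component of the view direction,
    # so the parallax is appropriate whichever way the viewer's head is turned.
    horiz = np.array([direction[0], direction[1], 0.0])
    n = np.linalg.norm(horiz)
    if n < 1e-6:
        right = np.array([1.0, 0.0, 0.0])          # zenith is degenerate: fall back to an arbitrary axis
    else:
        right = np.array([horiz[1], -horiz[0], 0.0]) / n
    origin = eye * (ipd / 2.0) * right
    return origin, direction
```

Note that the zenith is degenerate: directly overhead there is no preferred horizontal offset direction, which is why the sketch falls back to an arbitrary axis there.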

Example
Consistent parallax across the fisheye. Raytraced with PovRay using a custom camera type. Redentore model courtesy Nathan O'Brien.

Example: iDome
Stereo fisheye pairs derived from stereoscopic spherical images by Peter Murphy.

Example: iDome
Stereoscopic fisheyes calculated from stereoscopic spherical images, courtesy Sarah Kenderdine.

Concluding remarks
Cylindrical stereoscopic movies are well established and working at iCinema, as well as at other AVIE installations; this includes realtime, photographic, video, and CG content. Stereoscopic domes (two at the time of writing) are being deployed by planetarium installers; to date they are not creating the optimal stereoscopic fisheye pairs discussed here, but rather simple parallel fisheye projections, with the expected errors as one looks away from the central direction. Successful tests have been conducted in the iDome using a Christie 120 Hz frame-doubling stereo-capable DLP projector (further tests used anaglyph pairs). The main problem at the moment is the unsuitability of the currently available lens from Christie.