Projection simulator to support design development of spherical immersive display

Wataru Hashimoto, Yasuharu Mizutani, and Satoshi Nishiguchi

Faculty of Information Sciences and Technology, Osaka Institute of Technology, 1-79-1 Kitayama, Hirakata City, Osaka, 573-0196, Japan
{wataru.hashimoto, yasuharu.mizutani, satoshi.nishiguchi}@oit.ac.jp

Abstract. This research aims to develop a simulator that supports the construction of a spherical immersive display, a system that can provide a realistic sense of presence, as if the user exists in another space. In general, when developing such a display, it is necessary to perform the optical design of the projection system while taking into account the special distortion correction required on the dome screen. However, the accuracy of the optical system that is actually manufactured is not guaranteed to match the simulation, and fine adjustment becomes necessary again when the display is used. In this paper, we report on the development of a projection simulator that can perform optical system adjustment and distortion correction simultaneously during the optical design of the projection system.

Keywords: Spatially immersive display, optical design, image registration, distortion correction

1 Introduction

The immersive projection display is one of the most powerful systems for enhancing spatial presence and experienced realism. Cave-like immersive displays are increasingly being used for immersive Virtual Reality (VR) applications. A major problem with immersive displays is that they cannot be installed easily, due to the large amount of space required by the surrounding screens. In order to improve space utility and reduce the required volume, a fisheye lens or a convex mirror, for example, is utilized to shorten the projection distance. An ultra-short-focus projector is also an appropriate device for decreasing the spatial cost. It is therefore necessary to simulate the optical system of the projection display; however, optical simulation with such special lenses is too complex to be implemented easily. Moreover, since the optical system is sensitive to installation precision, the designed projection cannot always be obtained. After the display equipment has been assembled, fine adjustments on the screen are necessary to correct the error introduced by the optical system and achieve accurate projection. Furthermore, the image distortion caused by the optical system must be applied to the display content. The purpose of this research is to make the design of an immersive display straightforward, and to develop an environment that can correct errors due to the optical system. We focus on the spherical immersive display (Fig. 1), which can be realized on an

average room-size scale among immersive displays, and we organize the development flow of the spherical immersive display in order to extract the problems that obstruct its development. Moreover, we aim to develop a simulator that makes it possible to build such a device without requiring special knowledge or experience.

Fig. 1. Spherical immersive display

2 Spherical Immersive Display Features

A planetarium is one of the systems most similar to a spherical immersive display. Since the projection technology of the planetarium is well established, many technical resources [1] are available for projecting onto the hemispherical dome. Recently, there have been several examples of DIY planetariums, owing to the widespread availability of projectors and other such optical devices. Although the planetarium can be classified as a spherical immersive display, the displays utilized in the VR field have the following additional features.

Interactivity. The user can interact with the image; therefore, the projected image is not pre-rendered as in a movie, but must be generated immediately.

Rear projection. Rear projection is suitable for surrounding the user completely with a screen, and it also prevents the user from casting a shadow on the screen.

Viewpoint position dependency. As the curvature radius of the spherical screen is shortened, the user begins to notice the distortion of the projected image on the screen surface; this viewpoint position dependency therefore restricts the user's movement.

Based on the above features, we organize the development flow of the spherical immersive display.

3 Spherical Display Development Flow

The development flow of the spherical display can be roughly divided into the following four areas.

1. Optical design
Optical simulation is necessary in order to project the image onto the spherical surface correctly. The simulation determines, among other things, the size, shape, and placement of the screen, as well as the direction and focus distance of the projector. Points to consider in the optical design include whether the projected image is in focus on the screen, whether the projector light is blocked by the user or by the structure, and whether the projector pixels are used effectively. It is also necessary to pay attention to the seams of the projected images when multiple projectors are utilized simultaneously. Since the optical simulation can also predict the distortion of the image projected onto a curved screen, a correspondence table for distortion compensation, utilized in flow 4, is generated through the simulation.

2. Spherical screen prototype
A spherical screen and its optical system are built based on the results of flow 1. It is extremely difficult to form a spherical screen with high precision. In general, FRP molding or scraping of a polystyrene mass is adopted to form the large curvature radius of the spherical screen.

3. Adjustment
The optical system of the prototype display usually contains undesirable manufacturing error. Since the optical system is sensitive to such error, the ideal design result cannot be obtained as-is. Fig. 2 shows the grid (latitude/longitude) computed by the optical design described in flow 1. The grid in the left-hand image is clearly shifted from the ideal position shown in the right-hand image, due to manufacturing error in the optical system. Therefore, it is necessary to physically move the screen or projector and adjust the grid position on the screen in order to approach the ideal state shown on the right. Several studies (for example, [3]) use external cameras that observe the projection state in order to adjust the grid position automatically.

Fig. 2. Result of fine alignment for the domed screen, shown on the right. Manufacturing error causes misalignment, as shown on the left

4. Visual content creation
In order to project visual content onto the spherical surface accurately, a pre-distorted image must be generated, for which the two-pass rendering method is a suitable approach. The rendering requires the correspondence table obtained by the adjustment described in flow 3 (a small illustrative sketch of such a table and its use is given at the end of this section). A rendering method using a game engine such as Unity has also been proposed [4].

The adjustment mentioned in flow 3 is an essential process, owing to the error involved in the manufacturing and assembly of the optical system designed in flow 1. As the number of points in the adjustment grid increases, the distortion correction accuracy increases; however, the adjustment becomes troublesome when the number of points is very large. In this research, we propose a simulator that can perform optical design and adjustment simultaneously, so as to reduce the cost of adjustment while maintaining sufficiently high distortion correction accuracy. In particular, we aim to fine-adjust all parameters related to placement while confirming the distortion correction result. This amounts to estimating the actual parameters, including manufacturing error, from the state of the distortion correction that is actually displayed. Once all parameters are precisely estimated, it is easy to increase the number of points in the distortion correction grid.
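To make the role of the correspondence table in flows 1, 3, and 4 concrete, the following is a minimal sketch of one possible representation and of the second rendering pass that consumes it. It is our illustration rather than the authors' code: the entry layout, the one-table-per-cube-map-face assumption, and all type and function names (Correspondence, writeWarpMeshObj, and so on) are hypothetical. The idea is that drawing a mesh whose vertex positions are the projector image-plane coordinates and whose texture coordinates address the cube-map rendered in the first pass yields the pre-distorted projector image; writing the mesh as OBJ also matches the export mentioned in the conclusion.

// Minimal sketch (not the authors' code): a correspondence table entry links a
// cube-map feature point M to the projector image-plane position I where it
// must be drawn. One such table is assumed per cube-map face.
#include <cstdio>
#include <vector>

struct Correspondence {
    float mu, mv;    // texture coordinate of the feature point on the cube-map face
    float ix, iy;    // normalized projector image-plane position ([0,1] x [0,1])
};

// Grid of feature points, stored row-major: rows x cols entries.
struct CorrespondenceTable {
    int rows = 0, cols = 0;
    std::vector<Correspondence> entries;
};

// Second pass of a two-pass rendering: emit a textured mesh whose vertex
// positions are projector-plane coordinates and whose texture coordinates
// address the cube-map rendered in the first pass. Drawing this mesh with the
// cube-map face bound produces the pre-distorted projector image.
void writeWarpMeshObj(const CorrespondenceTable& t, const char* path) {
    FILE* f = std::fopen(path, "w");
    if (!f) return;
    for (const Correspondence& e : t.entries) {
        std::fprintf(f, "v %f %f 0\n", e.ix, e.iy);   // position on the projector plane
        std::fprintf(f, "vt %f %f\n", e.mu, e.mv);    // cube-map texture coordinate
    }
    for (int r = 0; r + 1 < t.rows; ++r) {            // one quad (two triangles) per grid cell
        for (int q = 0; q + 1 < t.cols; ++q) {
            int a = r * t.cols + q + 1, b = a + 1;     // OBJ indices are 1-based
            int c = a + t.cols, d = c + 1;
            std::fprintf(f, "f %d/%d %d/%d %d/%d\n", a, a, b, b, d, d);
            std::fprintf(f, "f %d/%d %d/%d %d/%d\n", a, a, d, d, c, c);
        }
    }
    std::fclose(f);
}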

4 Simulator Design

The projection of omnidirectional images can take several formats: the dome master format, which covers the projection range of a fisheye lens; the equirectangular format, which unwraps a 360-degree image onto a rectangle, as in a world map; and the cube-map format, which covers all directions around the user with a cube, as shown in Fig. 3. We opted for the cube-map, which can also be applied directly to the displayed result during the adjustment process. With this approach, it is only necessary to render images in the six axis directions and map them as textures onto the spherical shape.

Fig. 3. Omnidirectional image rendering with cube-map

To realize the cube-map rendering, distortion correction is performed based on the relationship between a light ray emitted from the projector P and its projection point S, as shown in Fig. 4. Point S, which is located on the spherical screen, lies on the extension of the straight line from the ideal viewpoint V to the feature point M of the cube-map. In the case of Fig. 4, the projector ray reaches point S after being reflected by the convex mirror at C. Given the convex mirror's position, orientation, and curvature, S can be obtained from the reflection of the ray from P at C. However, when S and P are already known, it is difficult to determine the reflection point C; that is, it is easy to follow the projector's ray in the forward direction, but difficult to calculate the inverse.

Fig. 4. Calculated points on the projector's image plane, corresponding to the feature points of the cube-map

The correspondence between the original plane of projection (I) and the feature points (M) of a cube-map centered on the ideal viewpoint is required for the distortion correction. If this correspondence cannot be determined by an inverse calculation, another method must be considered. We decided to calculate, in the forward direction P → C → S, the point displayed on the screen surface for every valid projector pixel. Then, for each feature point, the nearest point S to the intersection of the line VM with the screen surface is searched for, which yields the relationship between M and I. In order to reduce the number of comparisons in this nearest-point search, we applied the hill-climbing method as the search algorithm. Since all of these ray-tracing calculations are independent of one another, the method is also easy to parallelize.
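The forward trace P → C → S described above can be illustrated with a short, self-contained sketch. This is our reconstruction under simplifying assumptions, not the simulator's code: the convex mirror and the screen are both modeled as spheres, blocking and refraction are ignored, and all names (Sphere, traceProjectorRay, and so on) are hypothetical.

// Minimal sketch of the forward trace of one projector ray: emitted from the
// projection center P through an image-plane pixel, reflected at the convex
// mirror (point C), and intersected with the spherical screen to obtain S.
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(double s, Vec3 a) { return {s * a.x, s * a.y, s * a.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 a) { return (1.0 / std::sqrt(dot(a, a))) * a; }

struct Sphere { Vec3 center; double radius; };

// Nearest intersection of the ray o + t*d (t > 0, d normalized) with a sphere.
static std::optional<Vec3> intersect(Vec3 o, Vec3 d, const Sphere& s) {
    Vec3 oc = o - s.center;
    double b = dot(oc, d), c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;
    if (disc < 0) return std::nullopt;
    double t = -b - std::sqrt(disc);
    if (t <= 0) t = -b + std::sqrt(disc);
    if (t <= 0) return std::nullopt;
    return o + t * d;
}

static std::optional<Vec3> traceProjectorRay(Vec3 P, Vec3 pixelDir,
                                             const Sphere& mirror,
                                             const Sphere& screen) {
    Vec3 d = normalize(pixelDir);
    auto C = intersect(P, d, mirror);
    if (!C) return std::nullopt;                  // ray misses the mirror
    Vec3 n = normalize(*C - mirror.center);       // outward mirror normal at C
    Vec3 r = d - 2.0 * dot(d, n) * n;             // specular reflection direction
    return intersect(*C, normalize(r), screen);   // point S on the spherical screen
}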

5 Simulator Implementation

Fig. 5 provides an overview of the developed projection simulator. Several parameters related to the screen and projector configuration can be changed in the upper-right windows. The result of the ray-tracing process is reflected immediately in the left window. The result of the distortion correction process is rendered in the lower-right window as the feature points of the cube-map. Projecting the lower-right window onto the full screen enables us to confirm the actual projection condition while changing the parameters.

Fig. 5. Overview of the projection simulator. The left window shows the result of projecting a normal image onto the spherical screen. The lower-right window is the distortion-corrected image; if this image is projected onto the screen, the viewer sees it correctly

We applied this projection simulator to two spherical screen environments. Fig. 6 shows the first environment, a hemispherical screen with a diameter of 1800 mm onto which the image is projected via reflection from a convex mirror. Fig. 5 shows the result obtained by fine-adjusting the parameters while observing the distortion correction. The result of projecting the distortion-corrected image onto the screen is shown in Fig. 7. The distortion correction process requires at most 0.21 sec on a system equipped with an Intel CPU (Core i7-4710MQ, 16 GB main memory) when the total number of feature points is 121. If a full search is applied instead of the hill-climbing method, the process requires 7.05 sec. This means that the projection simulator using the hill-climbing method can follow parameter changes in almost real time, so the user can verify the pixel distribution of the design or adjust the alignment at any time.

Fig. 6. Projection simulator applied to environment 1: this environment consists of a hemispherical screen of 1800 mm in diameter, a projector, and a convex mirror with a curvature radius of 259 mm
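The gap between 7.05 sec for the full search and 0.21 sec for the hill-climbing search is consistent with the difference between scanning every valid projector pixel for each feature point and walking only a short path through the pixel grid. The following sketch shows one way the hill-climbing nearest-point search could look; it is our illustration with hypothetical names (hillClimbNearest, and so on), and it assumes the distance to the target point varies smoothly over the pixel grid so that the greedy walk is not trapped in a spurious local minimum.

// Minimal sketch of the hill-climbing nearest-point search: S holds the
// forward-traced screen point for every valid projector pixel (row-major,
// width x height; invalid pixels hold a point far from the screen), and
// target is the intersection of line VM with the screen surface.
#include <vector>

struct Point3 { double x, y, z; };
struct Pixel  { int x, y; };

static double dist2(Point3 a, Point3 b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

Pixel hillClimbNearest(const std::vector<Point3>& S, int width, int height,
                       Point3 target, Pixel seed) {
    Pixel cur = seed;
    double best = dist2(S[cur.y * width + cur.x], target);
    for (bool improved = true; improved; ) {
        improved = false;
        // Move to the 8-neighbor that is closest to the target, if any improves.
        for (int dy = -1; dy <= 1; ++dy) {
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = cur.x + dx, ny = cur.y + dy;
                if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                double d = dist2(S[ny * width + nx], target);
                if (d < best) { best = d; cur = {nx, ny}; improved = true; }
            }
        }
    }
    return cur;  // projector pixel whose projected point is closest to the target
}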

Fig. 7. Result of distortion correction in environment 1 (captured approximately 2 m behind the ideal viewpoint)

The second environment, shown in Fig. 8, consists of an acrylic dome screen with a diameter of 1000 mm and an ultra-short-focus projector behind the screen. Fig. 9 shows the distortion correction result of the cube-map on the actual screen. It can be confirmed that the left and right faces of the cube-map are displayed without distortion. The total time required for the distortion correction process was 6.8 sec using a full search, 2.78 sec using the full (all-point) search parallelized with OpenMP, and 0.17 sec using a combination of the hill-climbing method and OpenMP.

Fig. 8. Projection simulator applied to environment 2: this environment consists of a spherical screen of 1000 mm in diameter and an ultra-short-focus projector behind the screen

Fig. 9. Result of distortion correction in environment 2 (captured approximately 1 m behind the ideal viewpoint)
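Because each feature point's search is independent, the per-point loop parallelizes naturally, which is one way to read the OpenMP timings above. The sketch below shows such a parallel loop; it reuses the types and the hillClimbNearest routine from the previous sketch (declared here only as a prototype) and is again our illustration with hypothetical names, not the authors' implementation. Compile with OpenMP enabled (for example, -fopenmp).

// Minimal sketch of parallelizing the per-feature-point search with OpenMP.
#include <vector>

struct Point3 { double x, y, z; };
struct Pixel  { int x, y; };

// Defined as in the previous sketch.
Pixel hillClimbNearest(const std::vector<Point3>& S, int width, int height,
                       Point3 target, Pixel seed);

std::vector<Pixel> solveAllFeaturePoints(const std::vector<Point3>& S,
                                         int width, int height,
                                         const std::vector<Point3>& targets,
                                         Pixel seed) {
    std::vector<Pixel> result(targets.size());
    #pragma omp parallel for
    for (int i = 0; i < static_cast<int>(targets.size()); ++i) {
        // Each iteration writes only its own slot, so no synchronization is needed.
        result[i] = hillClimbNearest(S, width, height, targets[i], seed);
    }
    return result;
}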

6 Conclusion

We have developed a projection simulator that can simultaneously perform the optical design and the adjustment of a spherical display. The simulator enables us to estimate the substantive parameters of the optical system by observing the projection state. The obtained parameters make it easy to increase the number of feature points of the cube-map, which yields high distortion correction accuracy. The obtained distortion correction result is converted to the OBJ format, so that visual content can easily be created with a game engine such as Unity. The current simulator handles only parametric shapes; therefore, in future work we aim to support non-parametric shapes captured by devices such as a depth camera. In addition, further improvements in computation time are expected when GPGPU is applied to the simulator.

Acknowledgments. This work was supported by JSPS KAKENHI, Grant Number 15K00290. We also thank Prof. Corchado and Dr. De la Prieta at the University of Salamanca for assisting with this research.

References

1. IMERSA, 2017, http://www.imersa.org/
2. International Planetarium Society, 2017, http://www.ips-planetarium.org/
3. Ramesh Raskar, Jeroen van Baar, Thomas Willwacher, Srinivas Rao: Quadric Transfer for Immersive Curved Screen Displays, Eurographics, Vol. 23, No. 3, 2004
4. Paul Bourke: Dome: Immersive gaming with the Unity game engine, Proc. of CGAT09, pp. 265-272, 2009