Real-Time Image Based Lighting in Software Using HDR Panoramas

Jonas Unger, Magnus Wrenninge, Filip Wänström and Mark Ollila
Norrköping Visualization and Interaction Studio, Linköping University, Sweden

Abstract

We present a system allowing real-time image based lighting based on HDR panoramic images. The system performs the time-consuming diffuse light calculations in a pre-processing step, which is key to attaining interactivity. The real-time subsystem processes an image based lighting model in software, which would be simple to implement in hardware. Rendering is handled by OpenGL, but another graphics API could be substituted. Applications for the technique presented are discussed, including methods for realistic outdoor lighting. The system architecture is outlined, describing the algorithms used.

Introduction

Over recent years there has been much excitement about image based rendering, modelling and lighting [1,3,4,5,6,7]. The aim of this research is to create a system that allows dynamic objects to be lit accurately according to their environments, preferably using a global technique (i.e. everything in the surrounding scene can affect the given object), while still keeping the process fully real-time. The main benefit is greatly enhanced realism, as well as better visual integration through the added object-scene interaction.

This aim requires us to part with the ordinary way of doing real-time lighting, that is, with well-defined light sources. In real life, everything we are able to observe reflects light to a certain extent, and thus cannot be neglected in lighting calculations [1, 5]. Disregarding the fact that other objects cast light onto each other is a great simplification, and the greatest limitation of real-time rendering has always been the necessary use of local illumination models. Local illumination accurately models what a given surface would look like considering one or a few point light sources, but does not take into account the interaction between illuminated surfaces. The downside of this is evident in the poor realism seen in most computer games and other real-time applications.

In the last few years the use of lightmaps has become popular, since they allow static objects to have a radiosity solution baked onto the model as a texture. This way, the time-consuming part of the lighting calculations is moved to a pre-processing step. Lightmaps work very well for buildings and similar objects, but do not allow dynamic objects to be lit using global illumination. The fact that dynamic objects (for example player characters and opponents) are not lit as convincingly as their environments makes for poor and unrealistic integration.

Applications

Our technique of using images to light virtual scenes has many possible application areas. Here we will focus on lighting outdoor scenery. Lighting outdoor scenes has always been a difficult area in creating computer images, especially in real-time graphics. As mentioned earlier, this is a consequence of using local illumination models. Mainly, these fail to model the fact that in outdoor scenes a large contribution of incoming light is reflected light, whereas local illumination models only consider direct lighting, i.e. light sources that have a direct view of the object. By capturing the light situation using a light probe [5], far more realistic lighting can be achieved. This kind of lighting has proven successful in, for example, motion pictures. All kinds of natural phenomena in outdoor scenes could potentially benefit from using image based methods for lighting.

The System

Figure 1 shows the basic dataflow of the system. As can be noted, it is divided into two main blocks: offline image pre-processing and online real-time rendering.

Figure 1: System architecture.
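The sketch below is a minimal structural illustration of the two blocks in Figure 1, not the authors' code: an offline pass that turns an HDR radiance panorama into a diffuse map, and an online loop that lights geometry in software and hands it to the rendering API each frame. All type and function names here are placeholders assumed for illustration; the function bodies are stubs.

#include <vector>

struct LatLongMap { int width = 0, height = 0; std::vector<float> pixels; };
struct Mesh { /* vertices, normals, material, per-vertex colours */ };

// Offline block (stubs): load the HDR panorama and pre-process it into a diffuse map.
LatLongMap loadHDRPanorama(const char* /*path*/)        { return {}; }
LatLongMap computeDiffuseMap(const LatLongMap& radiance) { return radiance; }

// Online block (stubs): per-frame software lighting followed by a graphics API draw call.
void lightVertices(Mesh&, const LatLongMap&) { /* diffuse-map lookups per vertex */ }
void drawWithGraphicsAPI(const Mesh&)        { /* e.g. an OpenGL draw call */ }

int main()
{
    // Offline image pre-processing (done once per panorama).
    LatLongMap radiance = loadHDRPanorama("panorama.hdr");
    LatLongMap diffuse  = computeDiffuseMap(radiance);

    // Online real-time rendering.
    Mesh model;
    for (int frame = 0; frame < 3; ++frame) {   // stands in for the render loop
        lightVertices(model, diffuse);
        drawWithGraphicsAPI(model);
    }
}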

Irradiance Mapping

In order to achieve real-time lighting, the time-consuming integrations are performed in a pre-processing step. The HDR panorama image (the radiance map) is used to calculate a diffuse map which, during rendering, is used as a lookup table by mapping a normal direction N(θ, φ) to an irradiance value I_N(N). Both maps are stored as latitude-longitude maps, since that gives a one-to-one correspondence with spherical coordinates.

Figure 2: Radiance map.

The radiance map (see Figure 2) is a representation of the plenoptic function I(x, y, z, θ, φ, λ, t) as I_R(θ, φ). We assume that the interaction between an object at point P = (x, y, z) and the distant scene recorded in the radiance map is one-way, i.e. we only consider light transport from the surrounding scene to the object and not vice versa. This is accurate as long as the distance between the object and the surrounding scene is sufficiently great.
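To make the lookup concrete, the following is a minimal sketch (not the authors' code) of how a unit normal can be converted to spherical coordinates and then to pixel coordinates in a latitude-longitude map. The axis convention (θ measured from the +z axis, φ around it) and the exact pixel mapping are assumptions of this sketch; the paper only states that the maps are stored in latitude-longitude form.

#include <algorithm>
#include <cmath>

static const double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };   // assumed to be of unit length

// Unit direction -> spherical coordinates (theta in [0, pi], phi in [0, 2*pi)).
inline void directionToSpherical(const Vec3& n, double& theta, double& phi)
{
    theta = std::acos(std::max(-1.0, std::min(1.0, n.z)));  // polar angle from +z
    phi   = std::atan2(n.y, n.x);                           // azimuth around z
    if (phi < 0.0) phi += 2.0 * kPi;
}

// Spherical coordinates -> pixel in a width x height latitude-longitude map.
inline void sphericalToPixel(double theta, double phi,
                             int width, int height, int& px, int& py)
{
    px = static_cast<int>(phi / (2.0 * kPi) * width) % width;
    py = static_cast<int>(theta / kPi * height);
    if (py >= height) py = height - 1;   // clamp the pole row
}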

Figure 3: Diffuse map.

The diffuse map (see Figure 3) can be explained as follows: for every direction (θ, φ), given the point P where the radiance map was captured, there is an imaginary surface element with a normal N(θ, φ). The diffuse map then stores a measure of the incoming radiance onto the surface element for each direction N, which is calculated as follows:

I_N(N) = \int_0^{2\pi} \int_0^{\pi/2} I_R(L) \, (N \cdot L) \, d\theta \, d\phi, \qquad L = L(\theta, \phi), \; |L| = 1

where L is the unit vector in direction (θ, φ), I_R(L) is the intensity in the radiance map in direction L, and N is the surface element normal. In Figure 4 we show how the integration limits come from the hemisphere around the normal N. To produce an entire diffuse map, solve I_N for all N and store the result at the pixel corresponding to N.

Figure 4: Hemisphere.

Lighting

During rendering, the normal direction N of any given vertex in the object is used to look up the diffuse light intensity affecting it. Since the intensity values in the diffuse map are stored in spherical coordinates, this is a straightforward process. The value is then used in combination with the vertex material properties to produce the final colour at the vertex. The algorithm is outlined below:

For each vertex
    Get the normal N
    Convert N to spherical coordinates N = (θ, φ)
    Look up the pixel value stored at (θ, φ) in the diffuse map
    Light vertex according to material properties and the light intensity
    Store result as vertex colour
End
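The sketch below shows both steps in code; it is an illustration under stated assumptions, not the authors' implementation. The helper types and names are invented for the example, the radiance map is assumed to cover the full sphere in latitude-longitude form, and each texel is weighted by the sin(θ) solid-angle factor of that parameterisation, which the formula above does not spell out.

#include <algorithm>
#include <cmath>
#include <vector>

static const double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };
struct RGB  { double r, g, b; };

// HDR latitude-longitude image (radiance map or diffuse map).
struct LatLongMap {
    int width = 0, height = 0;
    std::vector<RGB> pixels;
    const RGB& at(int x, int y) const { return pixels[y * width + x]; }
};

// Direction of the pixel centre (x, y): theta from the +z axis, phi around it.
static Vec3 pixelToDirection(const LatLongMap& m, int x, int y)
{
    double theta = (y + 0.5) / m.height * kPi;
    double phi   = (x + 0.5) / m.width  * 2.0 * kPi;
    return { std::sin(theta) * std::cos(phi),
             std::sin(theta) * std::sin(phi),
             std::cos(theta) };
}

// One texel of the diffuse map: sum of I_R(L) (N . L) over the hemisphere
// around N, with a per-texel sin(theta) solid-angle weight (assumed here).
static RGB irradianceForNormal(const LatLongMap& radiance, const Vec3& n)
{
    RGB sum = {0.0, 0.0, 0.0};
    const double dTheta = kPi / radiance.height;
    const double dPhi   = 2.0 * kPi / radiance.width;
    for (int y = 0; y < radiance.height; ++y) {
        double theta = (y + 0.5) / radiance.height * kPi;
        for (int x = 0; x < radiance.width; ++x) {
            Vec3 L = pixelToDirection(radiance, x, y);
            double cosNL = n.x * L.x + n.y * L.y + n.z * L.z;
            if (cosNL <= 0.0) continue;   // texel lies outside the hemisphere around N
            double w = cosNL * std::sin(theta) * dTheta * dPhi;
            const RGB& Ir = radiance.at(x, y);
            sum.r += Ir.r * w;
            sum.g += Ir.g * w;
            sum.b += Ir.b * w;
        }
    }
    return sum;
}

// Per-vertex step from the algorithm above: convert the vertex normal to
// spherical coordinates, look up the diffuse map, modulate by the material.
static RGB lightVertex(const LatLongMap& diffuse, const Vec3& n, const RGB& albedo)
{
    double theta = std::acos(std::max(-1.0, std::min(1.0, n.z)));
    double phi   = std::atan2(n.y, n.x);
    if (phi < 0.0) phi += 2.0 * kPi;
    int px = std::min(static_cast<int>(phi / (2.0 * kPi) * diffuse.width),  diffuse.width  - 1);
    int py = std::min(static_cast<int>(theta / kPi       * diffuse.height), diffuse.height - 1);
    const RGB& I = diffuse.at(px, py);
    return { albedo.r * I.r, albedo.g * I.g, albedo.b * I.b };
}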

application to produce diffuse maps. This pre-processing removes the time-consuming integration involved in global illumination solutions allowing real-time performance during rendering. Figure 5: We demonstrate the difference between using a single point light source and using the algorithm described in this paper. Most importantly, the second image shows realism in two areas: matching light intensity with that perceived as coming from the surroundings, as well as matching the overall colour balance of the object to the panorama. We show two images which show the effect of colour blending. Great dynamism is evident in the second image where the light under the arcades is approximately 1000 times weaker than that coming from the somewhat cloudy sky. This gives rise to the truly dramatic, yet realistic, shading of the model. Figure 6: The dynamic properties of the system are shown in the above 3 images. Here, the model is clearly lit differently as it is spun to face different parts of the surrounding scene. The effect is clearly evident in the forehead of the model, as well as on its coat. 33

Conclusions and Future Work

Further developments of this lighting technique can be divided into three main areas: improving the quality and speed of the rendering through hardware implementation, adding greater realism to the given system, and extending the system to allow completely dynamic scenes and views to be rendered [2].

In order to speed up the rendering process even further, it could be implemented on consumer hardware with vertex and pixel programs. This would also allow the lighting to be calculated per pixel instead of per vertex, as is currently the case. The entire lighting process could then be performed using just a single texture unit; however, this requires implementation of HDR texture support in hardware, as well as reprogramming the hardware's texture lookups.

Yet greater realism could be achieved by adding reflections and specular highlights. Both could be pre-processed by the same image-processing software as used to create the diffuse maps. This way, rendering quality close to what ray-tracing software produces is possible for dynamic objects, not just static ones.

The system described is currently only able to display lighting situations captured at a single location. This would be quite sufficient to model outdoor lighting coming from the sun and clouds. A major advancement would be to extend the system to use many light situations. This would allow dynamic actors in a scene to be influenced by different light situations depending on where in the scene they are located.

Acknowledgements

Paul Debevec for providing the light probe images from http://www.debevec.org/.

References

[1] CHEN, W.-C., BOUGUET, J.-Y., CHU, M. H., AND GRZESZCZUK, R. 2002. Light field mapping: efficient representation and hardware rendering of surface light fields. In Proceedings of the 29th annual conference on Computer graphics and interactive techniques, ACM Press, pp. 447-456.
[2] DAMEZ, C., DMITRIEV, K., AND MYSZKOWSKI, K. 2002. Global illumination for interactive applications and high-quality animations. In Eurographics 2002 STAR Reports, pp. 1-24.
[3] COHEN, J., TCHOU, C., HAWKINS, T., AND DEBEVEC, P. 2001. Real-time high dynamic range texture mapping. In Rendering Techniques 2001: 12th Eurographics Workshop on Rendering, pp. 313-320.
[4] DEBEVEC, P. E. 1998. Rendering synthetic objects into real scenes. In Proceedings of the 25th annual conference on Computer graphics, ACM Press.
[5] DEBEVEC, P. E., AND MALIK, J. 1997. Recovering high dynamic range radiance maps from photographs. In Proceedings of the 24th annual conference on Computer graphics, ACM Press, pp. 369-378.
[6] DEBEVEC, P. 1998. Rendering synthetic objects into real scenes: bridging traditional and image-based graphics with global illumination and high dynamic range photography. In Proceedings of the 25th annual conference on Computer graphics and interactive techniques, ACM Press, pp. 189-198.