REAL-TIME GPU PHOTON MAPPING

SHERRY WU

Abstract. Photon mapping, an algorithm developed by Henrik Wann Jensen [1], renders a scene more realistically than ray tracing or path tracing. Unlike ray tracing, photon mapping first emits a large number of photons from the light sources in a scene and then uses the resulting photon map to calculate the radiance of each pixel in the rendered image. Previous work in the area includes a GPU photon mapper written by the Stanford graphics group [1], but no real-time renderer exists yet. For this project, a real-time GPU photon mapper is implemented using Nvidia CUDA.

1. Introduction

Photon mapping builds on a more primitive rendering algorithm, ray tracing. Ray tracing works by discretizing the image plane, shooting a ray through each discretized unit of the image plane, and testing whether each ray intersects an object in the scene. If a ray hits an object, then for each light source, the direction from the intersection point to the light source and the color of the light source at the intersection are recorded and used by the material of the intersected object to compute shading. This process approximates a solution to the rendering equation:

L_o(x, ω_o, λ, t) = L_e(x, ω_o, λ, t) + ∫_Ω f_r(x, ω_i, ω_o, λ, t) L_i(x, ω_i, λ, t) (ω_i · n) dω_i

The ray tracing algorithm can be extended with visual effects that make the resulting image more realistic, such as shadows, reflection, refraction, and depth of field. It is also an embarrassingly parallel algorithm, but it remains computationally intensive, depending on the visual effects desired in the result. Photon mapping builds on ray tracing. In the juxtaposed images below, the left shows the photon map and the right shows the rendered image, which uses the photon map to calculate the extra brightness on the red, white, and blue walls.

The photon maps are built in the first step. Two maps are produced: one for diffuse photons and one for caustics. The diffuse photons are generated as follows. A photon starts at a light source and follows a ray determined by the light source's distribution. We restrict the discussion to a point Lambertian source, which uniformly samples a ray on a unit hemisphere. The photon follows that ray until it hits an object. If the photon hits a diffuse object, it is stored in the diffuse map. We then decide whether the photon bounces off the object or is absorbed using a simple sampling technique, Russian roulette, which chooses between a pair of events based on a single random number. The caustic photons are generated similarly.

In the second step, the image is rendered much as in the ray tracing algorithm. The main difference lies in computing the color from the light: in addition to the shaded color from the material, the colors of the k nearest photons are averaged and added. After the calculation is done for each pixel, the result is written to a file on disk or displayed.

2. Background and Previous Work

Several groups have built photon mappers for the GPU, including Wann Jensen himself [1], but few have used spatial hashing [2], that is, hashing in higher dimensions. One advantage the spatial hash has over the octree, the more conventional photon storage data structure, is that its layout is linear: no pointer indirections are required to traverse it. The linear layout also simplifies memory management: the hash can be copied to and from the GPU with a single call to cudaMemcpy, and allocated and freed with single calls to the corresponding functions. Researchers at Microsoft Research have constructed a perfect spatial hash for a given scene [3]; this implementation, however, aims to be flexible, with a spatial hash that performs well for arbitrary scenes.

3. Implementation

The photon mapper is written in C++ with CUDA, Nvidia's framework for general-purpose GPU computing. We chose GPUs over x86 CPUs because GPUs offer far more compute power than CPUs for the same price, as can be seen in the following chart.

In particular, GPUs are able to achieve this kind of performance because they have hundreds of shaders, analogous to cores, on one die. These shaders, while primitive, are very fast at simple tasks such as arithmetic, which is the bulk of the work in ray tracing and photon mapping.

3.1. CUDA. CUDA, or Compute Unified Device Architecture, provides a collection of functions to interface with the GPU. It supports a subset of C and C++ functionality on the GPU. In particular, recursion is not supported because of the limited stack on the GPU. Since the ray tracing algorithm takes a fixed number of bounces as input, that number can be made a compile-time constant, and recursion can then be simulated using template metaprogramming.

3.2. Spatial Hashing. The spatial hash implemented in this version of the photon mapper is quite naive. We arbitrarily bound the scene space with a cube and then partition it into some number k^3, k ∈ N, of identical smaller cubes. The hash of a photon that lies within the cube is computed from its position (x, y, z) as follows:

(1)  h(x, y, z) = k^2(⌊x⌋ + k) + k(⌊y⌋ + k) + (⌊z⌋ + k).

For the few photons that lie outside the cube, the coordinates are clamped to the cube before being fed into (1). The spatial hash keeps track of the photon IDs that fall in each subcube. During the global illumination and caustic illumination computations, given an intersection position x, the renderer computes the luminance by averaging the power of the photons in the subcube that contains x. Although this is not a true solution to the k-nearest-neighbors problem, given a sufficient number of photons in the subcube, this strategy yields a good estimate of the average of the k nearest photons, assuming the colors of the photons have spatial locality and are not completely random.

4. Results

Currently, the ray tracing component has been tested and benchmarked.
In particular, three scenes have been used for testing, ranging from simple to complex. The left scene contains four smaller diffuse balls. Again, reflections can be seen in the central red ball, which now reflects the green ball and the arrangement (visible in the reflection on the plane).

The right scene contains nine additional small specular balls, set up to act as mirrors. In addition, the larger balls are now refractive. This scene was designed both to tax the GPU and to test refraction. In addition, benchmark results were obtained for the left scene. The graph indicates that there is a large overhead in setting up the GPU, but the actual compute time is minimal once the data is on the GPU. Note that the exponential scaling is much less apparent on the GPU than on the CPU, which shows that distributing work over the shaders is quite effective. The larger input sizes are of interest for developing the program into a feasible real-time renderer, especially for high screen resolutions such as 1920x1080, 2560x1600, and possibly 3840x2400.

The photon mapping portion has not been tested, although preliminary dumps of the diffuse map suggest that its associated code is functional.

5. Conclusion

The goal of the project was to write a GPU photon mapper that implements spatial hashing, as an in-depth introduction to GPU programming and photon mapping. The project was implemented in stages: first a primitive ray caster, then a ray tracer with no special effects, and, at the time of writing, a ray tracer that supports reflections and refractions. While coding, much emphasis was placed on getting the code correct the first time, since there was relatively little time to debug and debugging large swaths of code is painful; unit tests were developed in parallel to ensure correctness. With this project, we hope to demonstrate the sheer power of GPUs, which are relatively inexpensive per teraflop compared to high-end multicore x86 chips. In addition, we hope to revolutionize graphics in games by taking advantage of the additional GPU power.

6. Future Work

Because of the short time given for this project, there are several optimizations and extensions that can still be made.
Two immediate optimizations are tuning the block and grid sizes (the work distribution parameters for the GPU) to maximize performance, and replacing calls to malloc with

cudaMallocHost, which does more housekeeping but has been shown to reduce copy times from host to device. Instead of storing a color as a vector3, which is 16 bytes, we can store the wavelength of the light. Since the visible spectrum runs from 400 nm to 700 nm, the wavelength can be stored as an unsigned char, which is 1 byte, at the cost of omitting 15% of the spectrum. This not only conserves memory on the host and the device, but also cuts down the time spent transferring data between the CPU and the GPU, a costly operation. Furthermore, instead of using the shade and shadeambient methods for shading, we can use bidirectional reflectance distribution functions, which lead to a more general implementation that can be extended to more interesting materials, such as marble, water, and skin.

Acknowledgements

I would like to thank Alan Edelman and Jeff Bezanson for giving me the opportunity to research photon mapping for 18.337. I would also like to thank Charles Leiserson, Saman Amarasinghe, and Bayley Wang for exposing me to photon mapping, ray tracing, and high performance computing.

References

[1] Timothy J. Purcell, Craig Donner, Mike Cammarano, et al., Photon Mapping on Programmable Graphics Hardware. Stanford University and University of California, San Diego, 2003.
[2] Sanket Gupte, Real-Time Photon Mapping on GPU. University of Maryland, Baltimore County, 2011.
[3] Sylvain Lefebvre, Hugues Hoppe, Perfect Spatial Hashing. Microsoft Research, 2006.