Real-time High Dynamic Range Image-based Lighting


César Palomo
Department of Computer Science, PUC-Rio, Rio de Janeiro, Brazil

ABSTRACT
In this work we present a real-time method for lighting virtual objects using measured scene radiance. To compute the illumination of CG objects inserted in a virtual environment, the method uses previously generated high dynamic range (HDR) image-based lighting information of the scene, and performs all calculations in graphics hardware to achieve real-time performance. This work is not intended to be a breakthrough in real-time processing or in HDR image-based lighting (IBL) of CG objects; rather, we aim to provide the reader with up-to-date knowledge of the IBL technique, an in-depth understanding of the HDR rendering (HDRR) pipeline in current graphics hardware, and interesting image post-effects which deliver higher realism to the final viewer of the scene.

Keywords: Image-based lighting, High Dynamic Range, Real-time Rendering, Shading, Graphics Hardware Programming

1. INTRODUCTION
We describe the use of a previously captured HDR map containing lighting information of a scene to illuminate computer-generated virtual objects as if they were seamlessly placed in the captured scene. All computations are performed on graphics hardware, by means of extensive use of vertex and fragment shaders, in order to enable real-time interaction and display. We provide effects like reflection, refraction, the Fresnel effect and chromatic dispersion [29] as options for the lighting equation. Furthermore, post-effects such as bloom [12] are produced to give the final scene a more realistic appearance. To achieve a high frame rate, we make the simplification of not computing inter-reflections among the computer-generated objects.
Video games especially can benefit from this technique, as it creates more realistic scenes than standard lighting does, allowing developers and artists to produce impressive effects, as in games like Half-Life 2: Lost Coast (Valve Software®), Oblivion (Bethesda Softworks®) and Far Cry (Ubisoft®), to make the list short. The rest of this paper is organized as follows. In the next section we discuss related work relevant to this paper. Section 3 provides background on the image-based lighting technique and how lighting calculations are done, since this is the core of our method of lighting the virtual objects. Section 4 presents the concepts involved in HDR rendering: acquiring HDR images, storage, rendering, tone mapping and commonly used post-effects. Section 5 describes in depth the general method used in this work, joining the use of graphics hardware with the IBL and HDRR techniques described in the previous sections to produce real-time and realistic scenes. Section 6 presents the results obtained, and we conclude in Section 7.

2. RELATED WORK
Debevec [5] uses high dynamic range images of incident illumination to render synthetic objects into real-world scenes. However, that work employed non-interactive global illumination rendering to perform the lighting calculations. Significant work has since been done to approximate these full illumination calculations in real-time using graphics hardware. As particular examples, [17] and [18] use multi-pass rendering methods to simulate arbitrary surface reflectance properties. In this work we use a multi-pass rendering method to perform all the illumination and post-effects calculations and render the final scene at a highly interactive frame rate, similar to Kawase's 2003 work [19].
3. IMAGE-BASED LIGHTING TECHNIQUE
Image-based lighting can be summarized as the use of real-world images of a scene to build a model representing a surrounding surface, and the later use of this model's lighting characteristics to correctly illuminate subjects added to the 3D scene. From this description, we can identify two main decisions which need to be made when using IBL: how to represent the surrounding scene as a model, and how to perform the illumination calculations for the subjects added to the scene. Subsection 3.1 discusses the main kinds of environment maps commonly used with IBL and briefly compares them, while subsection 3.2 lists the lighting effects used in this work.

3.1 Environment Mapping Techniques

In short, environment mapping (EM) simulates objects reflecting their surroundings. The technique assumes that an object's environment is infinitely far from the object, and that there is no self-reflection. If those assumptions hold, the environment surrounding the subject can be encoded and modeled in an omnidirectional image known as an environment map. The method was introduced by Blinn and Newell [3]. All EM methods start with a ray from the viewer to a point on the reflector. This ray is then reflected or refracted with respect to the normal at that point, and the resulting direction is used as an index into an image containing the environment, to determine the color of the point on the surface.

3.1.1 Spherical Environment Mapping
The early method described in 1976 by Blinn and Newell [3] is known as spherical environment mapping. For each environment-mapped pixel, the reflection vector is transformed into spherical coordinates, which in turn are mapped to the range [0, 1] and used as (u, v) coordinates to access the environment texture. Despite being easy to implement, this technique has several disadvantages, as described in [1]: view-point dependency, distortions at the environment map's poles, and the required computational time.

3.1.2 Cubic Environment Mapping
Ten years later, in 1986, Greene [8] introduced the EM technique which is by far the most popular method implemented in modern graphics hardware, due to its speed and flexibility. The cubic environment map is obtained by taking six projected faces of the scene surrounding the object. The cube shape allows a linear mapping in all directions to six planar texture maps. For that reason, the resulting reflection does not suffer the warping or the damaging singularities associated with a sphere map, particularly at the edges of the reflection.
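The cube-map indexing just described — take the major axis of the lookup direction to select a face, then remap the remaining two components to texture coordinates — can be sketched as follows. This is an illustrative CPU-side Python version assuming the OpenGL face conventions, not code from the paper; in practice the graphics hardware performs this lookup.

```python
def cube_map_lookup(x, y, z):
    """Map a direction vector to a cube-map face and (u, v) coordinates.

    Follows the OpenGL cube-map conventions: the component with the
    largest magnitude selects the face, and the other two components,
    divided by that magnitude, are remapped from [-1, 1] to [0, 1].
    """
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                 # x is the major axis
        face = '+x' if x > 0 else '-x'
        sc, tc, ma = (-z, -y, ax) if x > 0 else (z, -y, ax)
    elif ay >= az:                            # y is the major axis
        face = '+y' if y > 0 else '-y'
        sc, tc, ma = (x, z, ay) if y > 0 else (x, -z, ay)
    else:                                     # z is the major axis
        face = '+z' if z > 0 else '-z'
        sc, tc, ma = (x, -y, az) if z > 0 else (-x, -y, az)
    u = (sc / ma + 1.0) / 2.0
    v = (tc / ma + 1.0) / 2.0
    return face, u, v
```

Because the mapping is linear per face, a direction straight down an axis lands at the center of that face, e.g. `cube_map_lookup(1.0, 0.0, 0.0)` gives `('+x', 0.5, 0.5)`.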
For its characteristics of being view-independent, presenting no singularities, and being commonly implemented in current graphics hardware, the cube map has been our choice in this work.

3.2 Lighting calculations
Having decided how to represent the model of the surrounding scene, the illumination model for the IBL technique still needs to be chosen. Below we briefly present the physical basis for the illumination models available in this work. A deeper explanation can be found in [29].

3.2.1 Reflection
When an incident vector I from the viewer's position reaches the object's surface at a point P, the reflection vector R is calculated taking into account the normal N at point P and the incident angle θI. This vector R is used to access the cube map texture in the correct face. If we assume that the object is a perfect reflector such as a mirror, the vector R can be computed in terms of the vectors I and N with Equation 1.

R = I - 2N(N . I) (1)

3.2.2 Refraction
When light passes through a boundary between two materials of different density (air and glass, for instance), the light's direction changes, since light travels more slowly in denser materials. Snell's Law describes what happens at this boundary with Equation 2, where η1 and η2 are the refraction indices for media 1 and 2, respectively.

η1 sin θI = η2 sin θT (2)

Basically, we use the built-in GLSL function refract to compute the vector T used to look up the environment map. Vector T is calculated in terms of the vectors I, N and the ratio of the indices of refraction η1/η2.

3.2.3 Fresnel effect
In real scenes, when light hits a boundary between two materials, some light reflects off the surface and some refracts through the surface. The Fresnel equations describe this phenomenon precisely, but since they are complex, it is common to use the simplified approximation depicted in Equation 3. In this case, both reflection and refraction vectors are calculated and used to look up the environment map, and the resulting color at the incident point is computed as shown in Equation 4.

reflcoef = max(0, min(1, bias + scale * (1 + I . N)^power)) (3)

finalcolor = reflcoef * reflcol + (1 - reflcoef) * refraccol (4)
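To make the lighting models of Equations 1-4 concrete, here is a minimal CPU-side sketch in Python. The actual implementation runs as GLSL shaders; the vector helpers (`dot`, `scale`, `sub`) and all parameter defaults are illustrative assumptions, and `refract` mirrors the behavior of the GLSL built-in of the same name.

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

def reflect(I, N):
    """Equation 1: R = I - 2 N (N . I), with I pointing toward the surface."""
    return sub(I, scale(N, 2.0 * dot(N, I)))

def refract(I, N, eta):
    """Snell's law (Equation 2) in the form of the GLSL built-in refract,
    where eta = n1 / n2. Returns the zero vector on total internal reflection."""
    d = dot(N, I)
    k = 1.0 - eta * eta * (1.0 - d * d)
    if k < 0.0:
        return (0.0, 0.0, 0.0)
    return sub(scale(I, eta), scale(N, eta * d + math.sqrt(k)))

def fresnel_coef(I, N, bias=0.0, scale_=1.0, power=5.0):
    """Equation 3: empirical approximation of the Fresnel reflectance."""
    return max(0.0, min(1.0, bias + scale_ * (1.0 + dot(I, N)) ** power))

def fresnel_mix(refl_col, refrac_col, coef):
    """Equation 4: blend the reflected and refracted environment lookups."""
    return tuple(coef * r + (1.0 - coef) * t
                 for r, t in zip(refl_col, refrac_col))
```

For chromatic dispersion, described next, `refract` would simply be evaluated three times with a different `eta` per RGB channel.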

3.2.4 Chromatic dispersion
The assumption that the refraction depends only on the surface normal, the incident angle and the ratio of indices of refraction is in fact a simplification of what happens in reality. Besides the mentioned factors, the wavelength of the incident light also affects the refraction. This phenomenon is known as chromatic dispersion, and it is what happens when white light enters a prism and emerges as a rainbow: the incident illumination (assumed to be white) is split into several refracted rays. The effect can be simulated by making the simplification that the incident vector I is split into three refracted vectors, one for each RGB channel, namely TRed, TGreen and TBlue. To calculate the resulting illumination, three ratios of indices of refraction are therefore provided and used, one per channel.

4. HIGH DYNAMIC RANGE IMAGING AND RENDERING
Usual digital images are created to be displayed on monitors which support up to 16.7 million colors (24 bits). For that reason, it is logical to store images in formats which match the color range of the display; formats like BMP and JPEG traditionally use 16, 24 or 32 bits per pixel. These image formats are said to have a low dynamic range (LDR). HDR images use a wider dynamic range, which allows pixels to represent much larger contrasts. When a picture is taken with a conventional camera, the exposure time chosen by the photographer determines which areas are captured by the picture, while the other areas are lost. For instance, if the photographer chooses a long exposure time, bright areas will not be correctly captured, while details in dark areas will be visible in the final image.
In contrast, if a short exposure time is set, only bright areas will expose the camera's sensor, and so they will be the only areas with visible detail in the final image. In short, HDR rendering avoids any clipping of the values and everything is computed more accurately. The result is an image that resembles reality more closely because it can contain both extremely dark and extremely bright areas, and it allows the user to set a specific exposure and simulate photographs as if taken of the real scene.

4.1 Acquiring and Storage Format
An early source of high dynamic range images in computer graphics were the renderings created by global illumination and radiosity algorithms. For instance, the RADIANCE synthetic imaging system by Greg Ward [27] outputs its renderings in Ward's Real Pixels format [26]. Later, Debevec presented in his seminal paper [4] a method for acquiring real-world radiance using standard digital camera photography. We use HDR images acquired with this technique in this paper, stored in the RGBE format [28], [10].

4.2 Tone Mapping
Since current conventional monitors have limited contrast and dynamic range, with common contrast ratios between 500:1 and 3000:1, and HDR images have pixel values which by far exceed those limits, we need a method called tone mapping to display such images on these monitors. Tone mapping operators are simply functions which map values from [0, ∞) to [0, 1). Several operators have been developed and give different results, as can be seen in [7], [20], [22], [23] and [2]. Some are simple and efficient enough for real-time use, while others are very complex and customizable for image editing. The tone mapping operator used in this work is described in Equation 5, where Exp is the exposure level for the scene and Bright Level is a maximum luminance value for the scene.
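As a minimal sketch of such an operator — assuming the Reinhard-style reading of Equation 5, Y = Exp (1 + Exp / Bright Level) / (1 + Exp), with the result clamped for display; the function and parameter names are illustrative, not the paper's shader code:

```python
def tone_map(exp_value, bright_level):
    """Tone-map an exposure-scaled HDR luminance to a displayable value.

    Assumes the Reinhard-style operator
        Y = Exp * (1 + Exp / BrightLevel) / (1 + Exp)
    where exp_value is the exposure-scaled luminance and bright_level is
    the maximum scene luminance; luminances above it burn out to white,
    so the result is clamped to 1.0 before display.
    """
    y = exp_value * (1.0 + exp_value / bright_level) / (1.0 + exp_value)
    return min(1.0, y)
```

In the actual pipeline this mapping is evaluated per pixel in a GLSL fragment shader over the HDR framebuffer.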
This operator allows the user to control exposure: with a small exposure the image is dark and hidden details in bright areas appear; conversely, with a high exposure the image is very bright and contains details in dark areas.

Y = Exp * (1 + Exp / Bright Level) / (1 + Exp) (5)

4.3 Blooming
A very common effect applied to HDR images is called bloom [12]. It enhances bright areas and highlights, giving a vivid impression to the image. To achieve a good blooming result, as explained in [15], successive Gaussian blur filters [13] with different kernel sizes must be applied. However, applying filters with big kernels does not allow for real-time performance, so an alternative approach is usually taken. With bilinear interpolation enabled, a Gaussian filter is applied to the original image. The image is then downsized (usually by powers of 2), and the Gaussian filter is applied again to the result. This process continues until a tiny image is obtained and no further downsizing improves the effect. All these images are then blended together, delivering a convincing bloom. The trick of the technique is that, thanks to the downsampling, different effective Gaussian kernel sizes are achieved at little expense. In fact, if an original texture of 128x128 is downsized to 64x64, then to 32x32 and finally to 16x16, and a 3x3 kernel is used to blur all images, the smallest 16x16 texture blurs as if a 24x24 kernel were applied to the original texture, but using far fewer computations, and therefore allowing better performance.

4.4 Rendering Pipeline
As already stated, HDR images have values beyond the [0, 1] range. For that reason, the normal 3D graphics pipeline cannot be used to render them, otherwise pixel values would be clamped to the [0, 1] range. With the support for floating-point textures, arithmetic and render targets in graphics hardware, HDRR became possible, and no workarounds would be necessary such as those

previously suggested by Debevec et al. [6]. There are many methods for implementing HDR rendering in current graphics hardware, but the following pipeline, which includes the application of the bloom effect [12], is very commonly used with the OpenGL 2.0 API [16]. The HDR image is sent to the graphics card using the GL_RGBA16F internal storage format, so that pixel values do not get clamped to the [0, 1] range. To apply the bloom effect [12], very bright areas of the image are first extracted into an HDR Frame Buffer Object (FBO). This texture is then downsampled, with bilinear filtering active, into other textures. These downsampled textures are blurred with a Gaussian blur filter shader [13]. The original HDR texture is then composited with all downsampled blurred textures using additive blending. At this point we have an HDR Frame Buffer Object with the desired bloom effect, but its associated color texture still contains high dynamic range values. Tone mapping is applied to the resulting HDR framebuffer to convert it to an LDR image, allowing display in the conventional framebuffer; the operator described in Equation 5 was implemented in this work using the GLSL shading language [24]. Finally, a quad [16] is drawn with the LDR texture generated in the previous step active, so that the final image can be displayed to the viewer.

5. GENERAL METHOD
To achieve real-time HDR IBL in this work, we apply a rendering pipeline consisting of the steps depicted below. For the sake of clarity, some of them are described in the following subsections. An HDR cross image in RGBE format [28], representing the environment scene, is loaded into memory. The six faces of this cube map image are extracted into six different images, corresponding to the positive and negative directions of the x, y and z axes [8]. A cube map texture is created using OpenGL 2.0's GL_RGBA16F_ARB internal format [16], and each extracted face is sent to graphics hardware into its corresponding face of the cube map texture.
Useful shaders for performing the lighting, applying the bloom effect and tone mapping the final HDR image are created; more details are provided in subsection 5.1. With one framebuffer object [16], FBO1, active, the cube map texture is applied to a big cube, which works as a skybox for the scene [14], representing the environment around the virtual objects. The virtual objects are positioned appropriately in the scene and rendered, still with FBO1 active, with the shader for the selected lighting effect (reflection, refraction, Fresnel effect or chromatic dispersion) active and the cube map texture bound. At the end of this phase, FBO1 contains the rendered HDR scene in its color texture [16], with the virtual objects already illuminated according to the environment. Another framebuffer object, FBO2, and the shader responsible for extracting bright areas from an HDR image are activated, and with FBO1's color texture active, a quad [16] is drawn. At this point FBO2's color texture contains an HDR image with only the bright areas. To apply the Gaussian blur filter, FBO2's color texture is downsampled into a collection of other framebuffer objects FBO_DS, to allow for better blooming results [15]. At the end of this phase the framebuffer objects FBO_DS contain blurred HDR images, downsized from the original one. The FBO_DS color textures are composited with blending enabled into framebuffer object FBO2; this yields the bloom effect which can be added to the final image. Finally, FBO2 is composited with FBO1 back into FBO1, using the tone mapping shader to perform the HDR-to-LDR conversion. To display the final rendered scene, FBO1's color texture is activated and a quad is drawn.

5.1 Shaders
In this work, seven shaders (seven pairs of fragment and vertex shaders) perform all the illumination calculations, apply the bloom effect and tone map the final HDR image.
All of them have been developed using the GLSL shading language [24]. Four shaders perform the illumination calculations for reflection, refraction, the Fresnel effect and chromatic dispersion [29], respectively. Two shaders are part of the bloom effect [12]: one extracts the bright areas of an HDR image, and the other applies the Gaussian blur filter to an HDR image, outputting a blurred HDR image. To perform the tone mapping operator presented in Equation 5, a final shader outputs an LDR image which can be displayed in OpenGL's conventional framebuffer [16].

6. RESULTS
On an Intel® Core Duo 1.83 GHz with an NVIDIA® GeForce Go 7600 graphics card, the technique as implemented here achieved a frame rate of 60 FPS at 640x480 screen resolution, lighting a model with 4500 triangles. Even at 1400x900 screen resolution it runs at 20 FPS illuminating the same subject with any of the available illumination effects, clearly maintaining a highly interactive rate.

This work achieves a higher frame rate than Kawase's [19], which runs at 45 FPS at 640x480 screen resolution on the same machine, but which includes other effects like depth of field [11] and glare [25]. At 1400x900 its performance drops to 15 FPS, clearly showing that the addition of further effects impairs the final performance of HDR-rendered scenes. The main difficulties in the course of this work were finding good OpenGL FBO implementation examples and debugging the written GLSL shaders. After extensive research, good code samples were found for FBO implementation in OpenGL. To solve the shader issues, the GLIntercept function-call interceptor was used. It can show the status of textures at each frame, allowing the results of each phase of the proposed HDRR pipeline to be inspected, which eased the development process. At the end of this article we show some snapshots of the results of our work. They illustrate how the bloom effect provides a more realistic impression of the scene, and how each of the illumination effects applied to the virtual subject contributes differently to the resulting scene.

7. CONCLUSION
In this work we presented a consolidated method for illuminating virtual objects inserted into a scene whose lighting characteristics are captured in an HDR image, using this environment lighting information to correctly calculate the shading of the CG subjects in real-time with the computational power of graphics hardware. We achieved real-time performance as intended, and the final images presented to the viewer accomplished the aim of compositing virtual objects with a real-scene environment without the viewer noticing what is virtual and what is real, thanks to the illumination calculations made possible by the HDR values of the environment. The assumption that there is no inter-reflection among the virtual subjects or with the surrounding scene is a necessary limitation of our work to enable real-time performance.
As future work, this assumption could be mitigated by using the method proposed in Hakura and Snyder's work [9], where a combination of minimal ray tracing for local objects and layered environment maps produces reflections and refractions that closely match fully ray-traced solutions. Using their technique together with graphics hardware programming through shaders would probably yield good visual results and improved performance compared to their approach. Some improvements to this work, not done yet due to lack of time, could be the inclusion of other effects besides blooming, like depth of field [11] and glare [25], which imitate the physical response of some camera lenses to light and can bring even more realism to the final composited scene.

8. REFERENCES
[1] AKENINE-MÖLLER, T., AND HAINES, E. Real-Time Rendering, 2nd edition (2002).
[2] ASHIKHMIN, M. A Tone Mapping Algorithm for High Contrast Images. In 13th Eurographics Workshop on Rendering (2002).
[3] BLINN, J. F., AND NEWELL, M. E. Texture and Reflection in Computer Generated Images. Communications of the ACM (October 1976), vol. 19, no. 10.
[4] DEBEVEC, P. E., AND MALIK, J. Recovering High Dynamic Range Radiance Maps from Photographs. In SIGGRAPH 97 (August 1997).
[5] DEBEVEC, P. Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-Based Graphics with Global Illumination and High Dynamic Range Photography. In SIGGRAPH 98 (July 1998).
[6] COHEN, J., TCHOU, C., HAWKINS, T., AND DEBEVEC, P. Real-Time High Dynamic Range Texture Mapping. In Proceedings of the Eurographics Rendering Workshop (2001).
[7] DRAGO, F., MYSZKOWSKI, K., ANNEN, T., AND CHIBA, N. Adaptive Logarithmic Mapping for Displaying High Contrast Scenes. In Eurographics 2003.
[8] GREENE, N. Environment Mapping and Other Applications of World Projections. IEEE Computer Graphics and Applications (November 1986), vol. 6, no. 11.
[9] HAKURA, Z. S., AND SNYDER, J. M. Realistic Reflections and Refractions on Graphics Hardware with Hybrid Rendering and Layered Environment Maps. In 12th Eurographics Workshop on Rendering (2001).
[10] HOLZER, B. High Dynamic Range Image Formats (2006).
[11] Depth of field.
[12] Bloom (shader effect).
[13] Gaussian blur.
[14] Skybox (video games).
[15]
[16] OpenGL 2.0 API.
[17] KAUTZ, J., AND MCCOOL, M. D. Approximation of Glossy Reflection with Prefiltered Environment Maps. Graphics Interface (2000).
[18] KAUTZ, J., AND MCCOOL, M. D. Interactive Rendering with Arbitrary BRDFs using Separable Approximations. Eurographics Rendering Workshop (June 1999).
[19] KAWASE, M. Real-Time High Dynamic Range Image-Based Lighting. masa/rthdribl/
[20] PATTANAIK, S. N., TUMBLIN, J., YEE, H., AND GREENBERG, D. P. Time-Dependent Visual Adaptation for Realistic Image Display. In Proceedings of ACM SIGGRAPH (2000).
[21] FERNANDO, R., AND KILGARD, M. J. The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics.
[22] REINHARD, E., AND DEVLIN, K. Dynamic Range Reduction Inspired by Photoreceptor Physiology. IEEE Transactions on Visualization and Computer Graphics (2004).
[23] REINHARD, E., STARK, M., SHIRLEY, P., AND FERWERDA, J. Photographic Tone Reproduction for Digital Images. ACM Transactions on Graphics (2002).
[24] ROST, R. J. OpenGL Shading Language, 2nd edition.
[25] SPENCER, G., SHIRLEY, P., ZIMMERMAN, K., AND GREENBERG, D. P. Physically-Based Glare Effects for Digital Images. In SIGGRAPH (1995).
[26] WARD, G. Real Pixels. Graphics Gems II (1991).
[27] WARD, G. J. The RADIANCE Lighting Simulation and Rendering System. In SIGGRAPH 94 (July 1994).
[28] WARD, G. J. High Dynamic Range Image Encodings.
[29] Environment Mapping Techniques. Addison-Wesley.

Figure 1: Illustrations of the effects in this work: reflection (top left), refraction (top right), Fresnel effect (two middle images), and chromatic dispersion (two bottom images)


Rendering Algorithms: Real-time indirect illumination. Spring 2010 Matthias Zwicker Rendering Algorithms: Real-time indirect illumination Spring 2010 Matthias Zwicker Today Real-time indirect illumination Ray tracing vs. Rasterization Screen space techniques Visibility & shadows Instant

More information

Complex Shading Algorithms

Complex Shading Algorithms Complex Shading Algorithms CPSC 414 Overview So far Rendering Pipeline including recent developments Today Shading algorithms based on the Rendering Pipeline Arbitrary reflection models (BRDFs) Bump mapping

More information

COMP environment mapping Mar. 12, r = 2n(n v) v

COMP environment mapping Mar. 12, r = 2n(n v) v Rendering mirror surfaces The next texture mapping method assumes we have a mirror surface, or at least a reflectance function that contains a mirror component. Examples might be a car window or hood,

More information

High Dynamic Range Image Texture Mapping based on VRML

High Dynamic Range Image Texture Mapping based on VRML High Dynamic Range Image Texture Mapping based on VRML Sung-Ye Kim and Byoung-Tae Choi 3D Graphics Research Team, Virtual Reality Research and Development Department, Computer Software Research Laboratory,

More information

CSE 167: Lecture #7: Color and Shading. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2011

CSE 167: Lecture #7: Color and Shading. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2011 CSE 167: Introduction to Computer Graphics Lecture #7: Color and Shading Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2011 Announcements Homework project #3 due this Friday,

More information

Computer Graphics (CS 563) Lecture 4: Advanced Computer Graphics Image Based Effects: Part 2. Prof Emmanuel Agu

Computer Graphics (CS 563) Lecture 4: Advanced Computer Graphics Image Based Effects: Part 2. Prof Emmanuel Agu Computer Graphics (CS 563) Lecture 4: Advanced Computer Graphics Image Based Effects: Part 2 Prof Emmanuel Agu Computer Science Dept. Worcester Polytechnic Institute (WPI) Image Processing Graphics concerned

More information

Simple Lighting/Illumination Models

Simple Lighting/Illumination Models Simple Lighting/Illumination Models Scene rendered using direct lighting only Photograph Scene rendered using a physically-based global illumination model with manual tuning of colors (Frederic Drago and

More information

Pipeline Operations. CS 4620 Lecture Steve Marschner. Cornell CS4620 Spring 2018 Lecture 11

Pipeline Operations. CS 4620 Lecture Steve Marschner. Cornell CS4620 Spring 2018 Lecture 11 Pipeline Operations CS 4620 Lecture 11 1 Pipeline you are here APPLICATION COMMAND STREAM 3D transformations; shading VERTEX PROCESSING TRANSFORMED GEOMETRY conversion of primitives to pixels RASTERIZATION

More information

Advanced Deferred Rendering Techniques. NCCA, Thesis Portfolio Peter Smith

Advanced Deferred Rendering Techniques. NCCA, Thesis Portfolio Peter Smith Advanced Deferred Rendering Techniques NCCA, Thesis Portfolio Peter Smith August 2011 Abstract The following paper catalogues the improvements made to a Deferred Renderer created for an earlier NCCA project.

More information

CENG 477 Introduction to Computer Graphics. Ray Tracing: Shading

CENG 477 Introduction to Computer Graphics. Ray Tracing: Shading CENG 477 Introduction to Computer Graphics Ray Tracing: Shading Last Week Until now we learned: How to create the primary rays from the given camera and image plane parameters How to intersect these rays

More information

Consider a partially transparent object that is illuminated with two lights, one visible from each side of the object. Start with a ray from the eye

Consider a partially transparent object that is illuminated with two lights, one visible from each side of the object. Start with a ray from the eye Ray Tracing What was the rendering equation? Motivate & list the terms. Relate the rendering equation to forward ray tracing. Why is forward ray tracing not good for image formation? What is the difference

More information

Light. Properties of light. What is light? Today What is light? How do we measure it? How does light propagate? How does light interact with matter?

Light. Properties of light. What is light? Today What is light? How do we measure it? How does light propagate? How does light interact with matter? Light Properties of light Today What is light? How do we measure it? How does light propagate? How does light interact with matter? by Ted Adelson Readings Andrew Glassner, Principles of Digital Image

More information

The 7d plenoptic function, indexing all light.

The 7d plenoptic function, indexing all light. Previous Lecture The 7d plenoptic function, indexing all light. Lightfields: a 4d (not 5d!) data structure which captures all outgoing light from a region and permits reconstruction of arbitrary synthetic

More information

Fast HDR Image-Based Lighting Using Summed-Area Tables

Fast HDR Image-Based Lighting Using Summed-Area Tables Fast HDR Image-Based Lighting Using Summed-Area Tables Justin Hensley 1, Thorsten Scheuermann 2, Montek Singh 1 and Anselmo Lastra 1 1 University of North Carolina, Chapel Hill, NC, USA {hensley, montek,

More information

VGP352 Week 8. Agenda: Post-processing effects. Filter kernels Separable filters Depth of field HDR. 2-March-2010

VGP352 Week 8. Agenda: Post-processing effects. Filter kernels Separable filters Depth of field HDR. 2-March-2010 VGP352 Week 8 Agenda: Post-processing effects Filter kernels Separable filters Depth of field HDR Filter Kernels Can represent our filter operation as a sum of products over a region of pixels Each pixel

More information

CHAPTER 1 Graphics Systems and Models 3

CHAPTER 1 Graphics Systems and Models 3 ?????? 1 CHAPTER 1 Graphics Systems and Models 3 1.1 Applications of Computer Graphics 4 1.1.1 Display of Information............. 4 1.1.2 Design.................... 5 1.1.3 Simulation and Animation...........

More information

Advanced Real- Time Cel Shading Techniques in OpenGL Adam Hutchins Sean Kim

Advanced Real- Time Cel Shading Techniques in OpenGL Adam Hutchins Sean Kim Advanced Real- Time Cel Shading Techniques in OpenGL Adam Hutchins Sean Kim Cel shading, also known as toon shading, is a non- photorealistic rending technique that has been used in many animations and

More information

The Rasterization Pipeline

The Rasterization Pipeline Lecture 5: The Rasterization Pipeline Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2016 What We ve Covered So Far z x y z x y (0, 0) (w, h) Position objects and the camera in the world

More information

Practical Shadow Mapping

Practical Shadow Mapping Practical Shadow Mapping Stefan Brabec Thomas Annen Hans-Peter Seidel Max-Planck-Institut für Informatik Saarbrücken, Germany Abstract In this paper we propose several methods that can greatly improve

More information

Scalable multi-gpu cloud raytracing with OpenGL

Scalable multi-gpu cloud raytracing with OpenGL Scalable multi-gpu cloud raytracing with OpenGL University of Žilina Digital technologies 2014, Žilina, Slovakia Overview Goals Rendering distant details in visualizations Raytracing Multi-GPU programming

More information

Screen Space Ambient Occlusion TSBK03: Advanced Game Programming

Screen Space Ambient Occlusion TSBK03: Advanced Game Programming Screen Space Ambient Occlusion TSBK03: Advanced Game Programming August Nam-Ki Ek, Oscar Johnson and Ramin Assadi March 5, 2015 This project report discusses our approach of implementing Screen Space Ambient

More information

Abstract. Introduction. Kevin Todisco

Abstract. Introduction. Kevin Todisco - Kevin Todisco Figure 1: A large scale example of the simulation. The leftmost image shows the beginning of the test case, and shows how the fluid refracts the environment around it. The middle image

More information

Ray Optics. Ray model Reflection Refraction, total internal reflection Color dispersion Lenses Image formation Magnification Spherical mirrors

Ray Optics. Ray model Reflection Refraction, total internal reflection Color dispersion Lenses Image formation Magnification Spherical mirrors Ray Optics Ray model Reflection Refraction, total internal reflection Color dispersion Lenses Image formation Magnification Spherical mirrors 1 Ray optics Optical imaging and color in medicine Integral

More information

Computer Graphics. Lecture 14 Bump-mapping, Global Illumination (1)

Computer Graphics. Lecture 14 Bump-mapping, Global Illumination (1) Computer Graphics Lecture 14 Bump-mapping, Global Illumination (1) Today - Bump mapping - Displacement mapping - Global Illumination Radiosity Bump Mapping - A method to increase the realism of 3D objects

More information

Computer Graphics (CS 543) Lecture 10: Normal Maps, Parametrization, Tone Mapping

Computer Graphics (CS 543) Lecture 10: Normal Maps, Parametrization, Tone Mapping Computer Graphics (CS 543) Lecture 10: Normal Maps, Parametrization, Tone Mapping Prof Emmanuel Agu Computer Science Dept. Worcester Polytechnic Institute (WPI) Normal Mapping Store normals in texture

More information

CS 4620 Program 3: Pipeline

CS 4620 Program 3: Pipeline CS 4620 Program 3: Pipeline out: Wednesday 14 October 2009 due: Friday 30 October 2009 1 Introduction In this assignment, you will implement several types of shading in a simple software graphics pipeline.

More information

I have a meeting with Peter Lee and Bob Cosgrove on Wednesday to discuss the future of the cluster. Computer Graphics

I have a meeting with Peter Lee and Bob Cosgrove on Wednesday to discuss the future of the cluster. Computer Graphics Announcements Assignment 4 will be out later today Problem Set 3 is due today or tomorrow by 9am in my mail box (4 th floor NSH) How are the machines working out? I have a meeting with Peter Lee and Bob

More information

Computer Vision 2. SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung. Computer Vision 2 Dr. Benjamin Guthier

Computer Vision 2. SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung. Computer Vision 2 Dr. Benjamin Guthier Computer Vision 2 SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung Computer Vision 2 Dr. Benjamin Guthier 3. HIGH DYNAMIC RANGE Computer Vision 2 Dr. Benjamin Guthier Pixel Value Content of this

More information

We present a method to accelerate global illumination computation in pre-rendered animations

We present a method to accelerate global illumination computation in pre-rendered animations Attention for Computer Graphics Rendering Hector Yee PDI / DreamWorks Sumanta Pattanaik University of Central Florida Corresponding Author: Hector Yee Research and Development PDI / DreamWorks 1800 Seaport

More information

Computer Graphics Lecture 11

Computer Graphics Lecture 11 1 / 14 Computer Graphics Lecture 11 Dr. Marc Eduard Frîncu West University of Timisoara May 15th 2012 2 / 14 Outline 1 Introduction 2 Transparency 3 Reflection 4 Recap 3 / 14 Introduction light = local

More information

CSE 167: Introduction to Computer Graphics Lecture #6: Colors. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013

CSE 167: Introduction to Computer Graphics Lecture #6: Colors. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013 CSE 167: Introduction to Computer Graphics Lecture #6: Colors Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013 Announcements Homework project #3 due this Friday, October 18

More information

Supplement to Lecture 16

Supplement to Lecture 16 Supplement to Lecture 16 Global Illumination: View Dependent CS 354 Computer Graphics http://www.cs.utexas.edu/~bajaj/ Notes and figures from Ed Angel: Interactive Computer Graphics, 6 th Ed., 2012 Addison

More information

Virtual Reality for Human Computer Interaction

Virtual Reality for Human Computer Interaction Virtual Reality for Human Computer Interaction Appearance: Lighting Representation of Light and Color Do we need to represent all I! to represent a color C(I)? No we can approximate using a three-color

More information

x ~ Hemispheric Lighting

x ~ Hemispheric Lighting Irradiance and Incoming Radiance Imagine a sensor which is a small, flat plane centered at a point ~ x in space and oriented so that its normal points in the direction n. This sensor can compute the total

More information

Lecture 17: Recursive Ray Tracing. Where is the way where light dwelleth? Job 38:19

Lecture 17: Recursive Ray Tracing. Where is the way where light dwelleth? Job 38:19 Lecture 17: Recursive Ray Tracing Where is the way where light dwelleth? Job 38:19 1. Raster Graphics Typical graphics terminals today are raster displays. A raster display renders a picture scan line

More information

Greg Ward / SIGGRAPH 2003

Greg Ward / SIGGRAPH 2003 Global Illumination Global Illumination & HDRI Formats Greg Ward Anyhere Software Accounts for most (if not all) visible light interactions Goal may be to maximize realism, but more often visual reproduction

More information

Homework #2. Shading, Ray Tracing, and Texture Mapping

Homework #2. Shading, Ray Tracing, and Texture Mapping Computer Graphics Prof. Brian Curless CSE 457 Spring 2000 Homework #2 Shading, Ray Tracing, and Texture Mapping Prepared by: Doug Johnson, Maya Widyasari, and Brian Curless Assigned: Monday, May 8, 2000

More information

Refraction of Light. This bending of the ray is called refraction

Refraction of Light. This bending of the ray is called refraction Refraction & Lenses Refraction of Light When a ray of light traveling through a transparent medium encounters a boundary leading into another transparent medium, part of the ray is reflected and part of

More information

Applications of Explicit Early-Z Culling

Applications of Explicit Early-Z Culling Applications of Explicit Early-Z Culling Jason L. Mitchell ATI Research Pedro V. Sander ATI Research Introduction In past years, in the SIGGRAPH Real-Time Shading course, we have covered the details of

More information

OpenGl Pipeline. triangles, lines, points, images. Per-vertex ops. Primitive assembly. Texturing. Rasterization. Per-fragment ops.

OpenGl Pipeline. triangles, lines, points, images. Per-vertex ops. Primitive assembly. Texturing. Rasterization. Per-fragment ops. OpenGl Pipeline Individual Vertices Transformed Vertices Commands Processor Per-vertex ops Primitive assembly triangles, lines, points, images Primitives Fragments Rasterization Texturing Per-fragment

More information

CS 325 Computer Graphics

CS 325 Computer Graphics CS 325 Computer Graphics 04 / 02 / 2012 Instructor: Michael Eckmann Today s Topics Questions? Comments? Illumination modelling Ambient, Diffuse, Specular Reflection Surface Rendering / Shading models Flat

More information

C P S C 314 S H A D E R S, O P E N G L, & J S RENDERING PIPELINE. Mikhail Bessmeltsev

C P S C 314 S H A D E R S, O P E N G L, & J S RENDERING PIPELINE. Mikhail Bessmeltsev C P S C 314 S H A D E R S, O P E N G L, & J S RENDERING PIPELINE UGRAD.CS.UBC.C A/~CS314 Mikhail Bessmeltsev 1 WHAT IS RENDERING? Generating image from a 3D scene 2 WHAT IS RENDERING? Generating image

More information

Real-Time Shadows. Last Time? Textures can Alias. Schedule. Questions? Quiz 1: Tuesday October 26 th, in class (1 week from today!

Real-Time Shadows. Last Time? Textures can Alias. Schedule. Questions? Quiz 1: Tuesday October 26 th, in class (1 week from today! Last Time? Real-Time Shadows Perspective-Correct Interpolation Texture Coordinates Procedural Solid Textures Other Mapping Bump Displacement Environment Lighting Textures can Alias Aliasing is the under-sampling

More information

Computergrafik. Matthias Zwicker. Herbst 2010

Computergrafik. Matthias Zwicker. Herbst 2010 Computergrafik Matthias Zwicker Universität Bern Herbst 2010 Today Bump mapping Shadows Shadow mapping Shadow mapping in OpenGL Bump mapping Surface detail is often the result of small perturbations in

More information

Lets assume each object has a defined colour. Hence our illumination model is looks unrealistic.

Lets assume each object has a defined colour. Hence our illumination model is looks unrealistic. Shading Models There are two main types of rendering that we cover, polygon rendering ray tracing Polygon rendering is used to apply illumination models to polygons, whereas ray tracing applies to arbitrary

More information

Ray Tracing. Kjetil Babington

Ray Tracing. Kjetil Babington Ray Tracing Kjetil Babington 21.10.2011 1 Introduction What is Ray Tracing? Act of tracing a ray through some scene Not necessarily for rendering Rendering with Ray Tracing Ray Tracing is a global illumination

More information

(Equation 24.1: Index of refraction) We can make sense of what happens in Figure 24.1

(Equation 24.1: Index of refraction) We can make sense of what happens in Figure 24.1 24-1 Refraction To understand what happens when light passes from one medium to another, we again use a model that involves rays and wave fronts, as we did with reflection. Let s begin by creating a short

More information

CS452/552; EE465/505. Intro to Lighting

CS452/552; EE465/505. Intro to Lighting CS452/552; EE465/505 Intro to Lighting 2-10 15 Outline! Projection Normalization! Introduction to Lighting (and Shading) Read: Angel Chapter 5., sections 5.4-5.7 Parallel Projections Chapter 6, sections

More information

REAL-TIME RENDERING OF HIGH QUALITY GLARE IMAGES USING VERTEX TEXTURE FETCH ON GPU

REAL-TIME RENDERING OF HIGH QUALITY GLARE IMAGES USING VERTEX TEXTURE FETCH ON GPU REAL-TIME REDERIG OF HIGH QUALITY GLARE IMAGES USIG VERTEX TEXTURE FETCH O GPU Hidetoshi Ando, obutaka Torigoe Department of Computer and Media, Graduate School of University of Yamanashi, Takeda 4-3-11,

More information

Ø Sampling Theory" Ø Fourier Analysis Ø Anti-aliasing Ø Supersampling Strategies" Ø The Hall illumination model. Ø Original ray tracing paper

Ø Sampling Theory Ø Fourier Analysis Ø Anti-aliasing Ø Supersampling Strategies Ø The Hall illumination model. Ø Original ray tracing paper CS 431/636 Advanced Rendering Techniques Ø Dr. David Breen Ø Korman 105D Ø Wednesday 6PM 8:50PM Presentation 6 5/16/12 Questions from ast Time? Ø Sampling Theory" Ø Fourier Analysis Ø Anti-aliasing Ø Supersampling

More information

Orthogonal Projection Matrices. Angel and Shreiner: Interactive Computer Graphics 7E Addison-Wesley 2015

Orthogonal Projection Matrices. Angel and Shreiner: Interactive Computer Graphics 7E Addison-Wesley 2015 Orthogonal Projection Matrices 1 Objectives Derive the projection matrices used for standard orthogonal projections Introduce oblique projections Introduce projection normalization 2 Normalization Rather

More information

CS580: Ray Tracing. Sung-Eui Yoon ( 윤성의 ) Course URL:

CS580: Ray Tracing. Sung-Eui Yoon ( 윤성의 ) Course URL: CS580: Ray Tracing Sung-Eui Yoon ( 윤성의 ) Course URL: http://sglab.kaist.ac.kr/~sungeui/gcg/ Recursive Ray Casting Gained popularity in when Turner Whitted (1980) recognized that recursive ray casting could

More information

Tutorial on GPU Programming #2. Joong-Youn Lee Supercomputing Center, KISTI

Tutorial on GPU Programming #2. Joong-Youn Lee Supercomputing Center, KISTI Tutorial on GPU Programming #2 Joong-Youn Lee Supercomputing Center, KISTI Contents Graphics Pipeline Vertex Programming Fragment Programming Introduction to Cg Language Graphics Pipeline The process to

More information

Deferred Rendering Due: Wednesday November 15 at 10pm

Deferred Rendering Due: Wednesday November 15 at 10pm CMSC 23700 Autumn 2017 Introduction to Computer Graphics Project 4 November 2, 2017 Deferred Rendering Due: Wednesday November 15 at 10pm 1 Summary This assignment uses the same application architecture

More information

Ray Tracing Effects without Tracing Rays

Ray Tracing Effects without Tracing Rays Introduction Ray Tracing Effects without Tracing Rays László Szirmay-Kalos, Barnabás Aszódi, and István Lazányi Budapest University of Technology and Economics, Hungary The basic operation of rendering

More information

Mali Demos: Behind the Pixels. Stacy Smith

Mali Demos: Behind the Pixels. Stacy Smith Mali Demos: Behind the Pixels Stacy Smith Mali Graphics: Behind the demos Mali Demo Team: Doug Day Stacy Smith (Me) Sylwester Bala Roberto Lopez Mendez PHOTOGRAPH UNAVAILABLE These days I spend more time

More information

CPSC 314 LIGHTING AND SHADING

CPSC 314 LIGHTING AND SHADING CPSC 314 LIGHTING AND SHADING UGRAD.CS.UBC.CA/~CS314 slide credits: Mikhail Bessmeltsev et al 1 THE RENDERING PIPELINE Vertices and attributes Vertex Shader Modelview transform Per-vertex attributes Vertex

More information

GUERRILLA DEVELOP CONFERENCE JULY 07 BRIGHTON

GUERRILLA DEVELOP CONFERENCE JULY 07 BRIGHTON Deferred Rendering in Killzone 2 Michal Valient Senior Programmer, Guerrilla Talk Outline Forward & Deferred Rendering Overview G-Buffer Layout Shader Creation Deferred Rendering in Detail Rendering Passes

More information

TSBK03 Screen-Space Ambient Occlusion

TSBK03 Screen-Space Ambient Occlusion TSBK03 Screen-Space Ambient Occlusion Joakim Gebart, Jimmy Liikala December 15, 2013 Contents 1 Abstract 1 2 History 2 2.1 Crysis method..................................... 2 3 Chosen method 2 3.1 Algorithm

More information

CS : Assignment 2 Real-Time / Image-Based Rendering

CS : Assignment 2 Real-Time / Image-Based Rendering CS 294-13: Assignment 2 Real-Time / Image-Based Rendering Ravi Ramamoorthi 1 Introduction In this assignment, you will implement some of the modern techniques for real-time and/or image-based rendering.

More information

NVIDIA Case Studies:

NVIDIA Case Studies: NVIDIA Case Studies: OptiX & Image Space Photon Mapping David Luebke NVIDIA Research Beyond Programmable Shading 0 How Far Beyond? The continuum Beyond Programmable Shading Just programmable shading: DX,

More information

Ray Tracing Assignment. Ray Tracing Assignment. Ray Tracing Assignment. Tone Reproduction. Checkpoint 7. So You Want to Write a Ray Tracer

Ray Tracing Assignment. Ray Tracing Assignment. Ray Tracing Assignment. Tone Reproduction. Checkpoint 7. So You Want to Write a Ray Tracer Ray Tracing Assignment So You Want to Write a Ray Tracer Goal is to reproduce the following Checkpoint 7 Whitted, 1980 Ray Tracing Assignment Seven checkpoints 1. Setting the Scene 2. Camera Modeling 3.

More information

Global Illumination CS334. Daniel G. Aliaga Department of Computer Science Purdue University

Global Illumination CS334. Daniel G. Aliaga Department of Computer Science Purdue University Global Illumination CS334 Daniel G. Aliaga Department of Computer Science Purdue University Recall: Lighting and Shading Light sources Point light Models an omnidirectional light source (e.g., a bulb)

More information

Recap of Previous Lecture

Recap of Previous Lecture Recap of Previous Lecture Matting foreground from background Using a single known background (and a constrained foreground) Using two known backgrounds Using lots of backgrounds to capture reflection and

More information

Today. Global illumination. Shading. Interactive applications. Rendering pipeline. Computergrafik. Shading Introduction Local shading models

Today. Global illumination. Shading. Interactive applications. Rendering pipeline. Computergrafik. Shading Introduction Local shading models Computergrafik Matthias Zwicker Universität Bern Herbst 2009 Today Introduction Local shading models Light sources strategies Compute interaction of light with surfaces Requires simulation of physics Global

More information

LEVEL 1 ANIMATION ACADEMY2010

LEVEL 1 ANIMATION ACADEMY2010 1 Textures add more realism to an environment and characters. There are many 2D painting programs that can be used to create textures, such as Adobe Photoshop and Corel Painter. Many artists use photographs

More information

Recap: Refraction. Amount of bending depends on: - angle of incidence - refractive index of medium. (n 2 > n 1 ) n 2

Recap: Refraction. Amount of bending depends on: - angle of incidence - refractive index of medium. (n 2 > n 1 ) n 2 Amount of bending depends on: - angle of incidence - refractive index of medium Recap: Refraction λ 1 (n 2 > n 1 ) Snell s Law: When light passes from one transparent medium to another, the rays will be

More information

CS452/552; EE465/505. Clipping & Scan Conversion

CS452/552; EE465/505. Clipping & Scan Conversion CS452/552; EE465/505 Clipping & Scan Conversion 3-31 15 Outline! From Geometry to Pixels: Overview Clipping (continued) Scan conversion Read: Angel, Chapter 8, 8.1-8.9 Project#1 due: this week Lab4 due:

More information

Chapter 32 Light: Reflection and Refraction. Copyright 2009 Pearson Education, Inc.

Chapter 32 Light: Reflection and Refraction. Copyright 2009 Pearson Education, Inc. Chapter 32 Light: Reflection and Refraction Units of Chapter 32 The Ray Model of Light Reflection; Image Formation by a Plane Mirror Formation of Images by Spherical Mirrors Index of Refraction Refraction:

More information

Part Images Formed by Flat Mirrors. This Chapter. Phys. 281B Geometric Optics. Chapter 2 : Image Formation. Chapter 2: Image Formation

Part Images Formed by Flat Mirrors. This Chapter. Phys. 281B Geometric Optics. Chapter 2 : Image Formation. Chapter 2: Image Formation Phys. 281B Geometric Optics This Chapter 3 Physics Department Yarmouk University 21163 Irbid Jordan 1- Images Formed by Flat Mirrors 2- Images Formed by Spherical Mirrors 3- Images Formed by Refraction

More information

Chapter 26 Geometrical Optics

Chapter 26 Geometrical Optics Chapter 26 Geometrical Optics 26.1 The Reflection of Light 26.2 Forming Images With a Plane Mirror 26.3 Spherical Mirrors 26.4 Ray Tracing and the Mirror Equation 26.5 The Refraction of Light 26.6 Ray

More information

Journal of Universal Computer Science, vol. 14, no. 14 (2008), submitted: 30/9/07, accepted: 30/4/08, appeared: 28/7/08 J.

Journal of Universal Computer Science, vol. 14, no. 14 (2008), submitted: 30/9/07, accepted: 30/4/08, appeared: 28/7/08 J. Journal of Universal Computer Science, vol. 14, no. 14 (2008), 2416-2427 submitted: 30/9/07, accepted: 30/4/08, appeared: 28/7/08 J.UCS Tabu Search on GPU Adam Janiak (Institute of Computer Engineering

More information

The Rasterization Pipeline

The Rasterization Pipeline Lecture 5: The Rasterization Pipeline (and its implementation on GPUs) Computer Graphics CMU 15-462/15-662, Fall 2015 What you know how to do (at this point in the course) y y z x (w, h) z x Position objects

More information

Game Technology. Lecture Physically Based Rendering. Dipl-Inform. Robert Konrad Polona Caserman, M.Sc.

Game Technology. Lecture Physically Based Rendering. Dipl-Inform. Robert Konrad Polona Caserman, M.Sc. Game Technology Lecture 7 4.12.2017 Physically Based Rendering Dipl-Inform. Robert Konrad Polona Caserman, M.Sc. Prof. Dr.-Ing. Ralf Steinmetz KOM - Multimedia Communications Lab PPT-for-all v.3.4_office2010

More information