Rendering Hair with Back-lighting


Tzong-Jer Yang, Ming Ouhyoung
Communications & Multimedia Laboratory, Dept. of Computer Science and Information Engineering, National Taiwan University, Taiwan, ROC

Abstract

This paper presents a technique to reproduce human hair's luster under back-lighting, which is most noticeable for light-colored hair. The technique uses an auxiliary buffer, called a density map, to calculate the intensity of light passing through overlapped hairs. The density map for each light source is created by extracting the color values from the frame buffer after the models are rendered from the light source's point of view with pixel blending. The technique can be applied to existing hair rendering pipelines with less than 20 percent computational overhead.

1. Introduction

Hair rendering has traditionally been one of the most challenging problems in computer graphics, mainly because of the large number of individual hairs. Several researchers have published different approaches to rendering human hair realistically and have obtained acceptable results. However, the way human hair shines under back-lighting conditions has received little attention. In fact, a back-lighting effect adds a crucial bit of realism to hair rendering, because it is common in natural scenes and makes a strong impression on viewers. Without a technique to simulate the back-lighting effect, animators must avoid placing an actor against light sources, which severely limits their artistic setups.

This paper presents a technique that simulates the back-lighting effect through an auxiliary buffer, called a density map. The density map relies on pixel blending, which is now widely available in graphics hardware and libraries. Rather than recording degrees of shadowing, as a shadow buffer does, a density map records the degrees of translucency of overlapped hairs. In conjunction with shadow buffers, density maps help calculate more realistically the intensity of back light piercing through a strand of hair. We use the density map to reproduce human hair's back-lighting effect, especially for light-colored hair, and show how it can be integrated efficiently into an existing hair rendering pipeline.

2. Previous research

In 1989, Kajiya and Kay proposed a rendering primitive, the texel, to deal with finely detailed scenes, and presented an impressive result for a teddy bear [1]. The texel works well on furry objects, but there is no clear method for applying it to long hair. However, the anisotropic lighting model derived for hair is useful and has been adopted by later researchers and commercial products [2][3][4][7]. In Kajiya's lighting model, the diffuse component is obtained by integrating a Lambertian surface along a half cylinder, and the specular component is calculated from an ad hoc Phong specular model.

A comprehensive hair rendering pipeline was proposed by LeBlanc and by Daldegan et al. [3][4]. The pipeline is based on pixel blending and shadow buffers, and provides a practical method to render anti-aliased hair with shadows for animation purposes. Anjyo et al. proposed another lighting model for fast rendering, but it is rather restrictive [5]: the diffuse reflection component is neglected, the specular reflection is based on Blinn's specular model, and random numbers are introduced to distinguish hair strands.
This model is relatively suitable for dark-colored hair, which absorbs most of the diffuse component and shines with strong specular reflection [8]. During the same period, Watanabe and Suenaga tried to simulate hair's shine under back-lighting conditions [6].

They used a double Z-buffer mechanism to identify thin areas of hair, and assigned higher color intensities to pixels within these thin areas. The double Z-buffer mechanism makes use of two Z-buffers and takes the difference between them as the thickness. Figure 1 explains this mechanism. Two Z-buffers are calculated first: a front Z-buffer and a rear Z-buffer. If z_i^F and z_i^R are the depth values at position P on the front and rear Z-buffers respectively, then the thickness of hair at position P equals z_i^F - z_i^R. This approach is straightforward, but it is not correct.

Figure 1. The double Z-buffer. The hair thickness at P is z_i^F - z_i^R.

Consider the situations illustrated in Figure 2. The thickness of two distant hairs will mistakenly be larger than that of a cluster of hairs tied together. The result is incorrect because the thickness of overlapped hairs is determined by the number of individual hairs, not by their distance. Consequently, the back-lighting effect will be visually inconsistent with human perception.

Figure 2. Incorrect hair thickness calculated with the double Z-buffer mechanism. (a) Two hairs separated far apart will appear thicker than (b) a cluster of hairs tied together.

3. Back-lighting effects using density maps

From daily observations, when human hair blocks a light source, the thinner parts of the hair seem to shine. As a result, if we know that a pixel on the screen is overlapped by fewer hairs, we can assign it a larger light intensity to reflect the back-lighting effect. To know the hair thickness for each pixel, an intuitive idea is to record the number of hairs crossing a pixel during the rasterization stage of the rendering pipeline. We realize this idea using pixel blending.

3.1 Pixel blending

In [10], Porter and Duff introduced an alpha channel to composite images with anti-aliasing. A short description of this image composition methodology, or pixel blending, can be found in [3]. In summary, one treats each pixel as a square box and calculates the portions of objects inside the box. The intensity of a pixel P_n after blending in the nth object is therefore determined by

    P_n = (1 - A_n) P_{n-1} + A_n O_n,   P_0 = B                          (1)

where A_n is the fraction of pixel area covered by the nth object, O_n is the color intensity of the nth object, and B is the color intensity of the background image.

The pixel blending technique can be implemented by introducing an additional component for each pixel in the frame buffer. This additional component, or alpha channel, controls how a new source pixel value is blended into an existing destination pixel. For a pixel, we can calculate the fraction of pixel area covered by an object and assign that fraction to the pixel's alpha channel; the color of the pixel then becomes a blend of the object's color and the pixel's original color. Alternatively, we need not calculate the covered fraction at all, but may simply assign the alpha value explicitly.

A useful property of pixel blending helps us determine the hair thickness for each pixel. Examine equation (1) again. If one lets all A_n equal A and all O_n equal O, one obtains

    P_0 = B
    P_n = (1 - A) P_{n-1} + A O = P_{n-1} + (O - P_{n-1}) A

where 0 ≤ O, B, A ≤ 1, hence 0 ≤ P_n ≤ 1 for all n.
Furthermore, if one sets O to 1, the largest value, then P_n ≥ P_{n-1} for all n, since O ≥ P_n for all n. This property gives us a mechanism to determine the thickness of the hairs overlapping a pixel. Figure 3 illustrates the condition P_n ≥ P_{n-1} for all n.
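As a minimal numerical sketch of this behavior, assume a fixed alpha of 0.1 per hair and O = 1 (values chosen for illustration only, not taken from the paper's implementation); the following program prints the pixel intensity as more hairs are blended in.

    /* Sketch of the blending recurrence P_n = (1 - A) P_{n-1} + A * O with
       O = 1: the pixel intensity rises monotonically toward 1 as more hairs
       are blended into the pixel, and never exceeds it. */
    #include <stdio.h>

    int main(void)
    {
        const double A = 0.1;  /* explicit alpha assigned to every hair fragment */
        const double O = 1.0;  /* hairs are drawn with the largest intensity */
        double P = 0.0;        /* frame buffer cleared to 0 (the background B) */

        for (int n = 1; n <= 30; ++n) {
            P = (1.0 - A) * P + A * O;   /* one more hair crosses this pixel */
            if (n % 5 == 0)
                printf("after %2d hairs: P = %.3f\n", n, P);
        }
        return 0;   /* P approaches 1 (saturation) */
    }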

When more hairs intersect at a pixel, the intensity of the pixel gets larger. Consequently, we can use the intensity of a pixel as a ratio to compute the intensity of light passing through the hair. For example, if the intensity of a pixel is 0.8 and the intensity of the light is 0.6, we can infer that the intensity of light reaching the eyes is (1 - 0.8) × 0.6 = 0.12. If the intensity of the pixel changes to 0.2, the intensity of light reaching the eyes becomes (1 - 0.2) × 0.6 = 0.48, which is exactly what we would expect: when fewer hairs overlap, more light passes.

Figure 3. The intensity of a pixel becomes larger when more hairs intersect at the pixel. (a) shows a distribution of hairs, while (b) shows the resulting pixel intensities; a darker color means a larger intensity.

3.2 Density map

We now describe how we use pixel blending to create an auxiliary buffer, or density map, for generating the back-lighting effect. We call it a density map because it contains density information for each pixel. A higher density for a pixel means more hairs intersect at that pixel, and therefore more light is attenuated.

The density map for each light source is created by extracting the color values from the frame buffer after the models are rendered from the light source's point of view. Assume that a color value ranges from 0 to 1, and that the models are divided into two categories: a scene model and a hair model. First, we clear the frame buffer to the color value 0, which means that the density values are initially 0 for all pixels. Then both the scene model and the hair model are rendered with the color value 1, so that the property P_n ≥ P_{n-1} holds for all n if we keep the alpha values the same for all pixels.

While drawing the scene model, we disable pixel blending because the scene model is opaque; disabling pixel blending causes the scene model to be drawn with the largest intensity. While drawing the hair model, we enable pixel blending with a small alpha value, for example 0.1. This can be done because the alpha value is set explicitly. The alpha value causes the pixels occupied by the hair model to have intensities ranging from 0 to 1, which represent degrees of translucency; a higher intensity corresponds to a lower translucency. A pixel's intensity may eventually reach 1.0 even though more hairs still overlap the pixel. This situation does not bother us, since hair saturates after reaching a certain density, and overlapping more hairs on the pixel would completely obscure the background anyway. The alpha value can also reflect the hair's material: a higher alpha value increases a pixel's intensity more rapidly, which corresponds to thicker or darker hair with a lower degree of translucency.

The procedure for creating a density map for each light source is summarized as follows.

1. Clear the frame buffer to 0, the lowest intensity. This is usually done by setting the color to black for all pixels.
2. Disable lighting, and set the drawing color to 1, the largest intensity. This is usually done by setting the color to white.
3. Take the scene model and project it using one light source. Render the scene model into the frame buffer with pixel blending disabled.
4. Take the hair model and project it using the same light source. Render the hair model into the same frame buffer with pixel blending enabled. The alpha value for pixel blending should be set to reflect the hair's material attributes.
5. Extract the resulting frame buffer as the density map for the light source.
6. Extract the resulting depth buffer as the shadow buffer for the light source.
7. Repeat steps 1 to 6 for each light source.

Therefore, we have one density map and one shadow buffer for each light source.
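As a concrete illustration, the procedure might be realized with legacy OpenGL calls roughly as in the following sketch; the helper functions load_light_view, draw_scene_model, and draw_hair_model are assumed placeholders for the renderer's own routines, not code from the paper.

    /* Sketch: build the density map and shadow buffer for one light source.
       Assumes a current legacy OpenGL context of size w x h; the three helpers
       below are hypothetical stand-ins for the renderer's own routines. */
    #include <GL/gl.h>

    void load_light_view(int light);   /* set projection/modelview from the light */
    void draw_scene_model(void);
    void draw_hair_model(void);

    void build_density_and_shadow_maps(int light, int w, int h,
                                       float hair_alpha,     /* e.g. 0.1 */
                                       float *density_map,   /* w*h floats */
                                       float *shadow_buffer) /* w*h floats */
    {
        /* Step 1: clear the frame buffer to 0 (black): all densities start at 0. */
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        /* Step 2: no lighting; everything is drawn at intensity 1 (white). */
        glDisable(GL_LIGHTING);

        /* Step 3: render the opaque scene model from the light's point of view
           with blending disabled, so covered pixels get the largest intensity. */
        load_light_view(light);
        glDisable(GL_BLEND);
        glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
        draw_scene_model();

        /* Step 4: render the hair model with blending enabled and a small
           explicit alpha; each hair raises a pixel by (1 - P) * alpha, so the
           intensity grows with the number of overlapping hairs. */
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glColor4f(1.0f, 1.0f, 1.0f, hair_alpha);
        draw_hair_model();

        /* Steps 5 and 6: read back the color buffer as the density map and the
           depth buffer as the shadow buffer for this light. */
        glReadPixels(0, 0, w, h, GL_LUMINANCE, GL_FLOAT, density_map);
        glReadPixels(0, 0, w, h, GL_DEPTH_COMPONENT, GL_FLOAT, shadow_buffer);
    }
    /* Step 7: the caller repeats this for every light source. */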
4. Integrate the density map into an existing hair rendering pipeline

In [3], LeBlanc et al. proposed a comprehensive hair rendering pipeline that can render anti-aliased hair with shadows. The density map can be integrated into this pipeline efficiently to render hair with back-lighting.

4.1 A lighting model for back-lighting

To add the back-lighting effect using the density map, we first have to modify the lighting model. The lighting model adopted in [3] is

    H = L_A K_A + Σ_i S_i L_i [ K_D sin(θ) + K_S cos^n(φ + θ + π) ]

where H is the resulting hair intensity, L_A is the ambient light intensity received, K_A is the ambient reflectance coefficient, S_i is the amount by which the ith light should be attenuated by shadowing, L_i is the light intensity received, K_D is the diffuse reflectance coefficient, and K_S is the specular reflectance coefficient. The angles φ and θ are defined as in Figure 4: θ is the angle from the tangent vector of the hair strand to the light vector, and φ is the angle from the tangent vector of the hair strand to the eye vector. A detailed explanation can be found in [1][3].

Figure 4. Hair lighting geometry, where l points to a light source, e points to the eyes, r is the reflection of l, and t is the tangent vector of the hair strand.

Our modified lighting model, which adds the back-lighting effect, is

    H = L_A K_A + Σ_i S_i L_i [ K_D sin(θ) + K_S cos^n(φ + θ + π) ] + Σ_i (1 - D_i) L_i f(ω)

    f(ω) = 1 if ω ≥ ω_0, and 0 otherwise,

where ω_0 is the back-lighting viewing angle (160° in our case), and (1 - D_i) represents the fraction of the ith light intensity received by the eyes. The value of D_i is obtained from the density map. The ad hoc function f(ω) indicates that the back-lighting effect only happens when the eyes are against the light source, as shown in Figure 5; ω is the angle from the light vector to the eye vector, as shown in Figure 4.

Figure 5. A back-lighting effect takes place when the eyes are against the light source. The ad hoc function f(ω) specifies the effective area of back-lighting.

In fact, the modified lighting model can be read as H = the ambient light intensity + the intensity of reflected light + the intensity of refracted light. Although we simulate the refraction of light with an empirical model, the result is consistent with our visual perception.

4.2 Hair rendering pipeline with the back-lighting effect

The modified hair rendering pipeline is as follows.

1. Generate density maps and shadow buffers for all light sources. The procedure is described in Section 3.2.
2. Render the scene model with density maps and shadow buffers.
3. Take the hair model, and project all hair segments onto the viewing coordinate system. Sort all hair segments by their average depth.
4. Scan the hair segments in reverse depth order. For each segment,
   a. Determine the intensity H of each of the segment's endpoints, using the modified lighting model (a code sketch follows this list).
   b. Determine the fraction of pixel coverage for the line segment based on the width of the hair and the viewing projection. Set this as the line segment's alpha value. Note that this alpha value is unrelated to the one used in creating the density maps.
   c. Draw the line segment as an alpha-blended line into the frame buffer, using linear color interpolation between the intensities of the endpoints.
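As referenced in step 4a, the modified lighting model of Section 4.1 could be evaluated per endpoint roughly as in the following sketch; the data structure, names, and the clamping of the specular cosine are assumptions made for illustration, not the paper's implementation.

    /* Sketch of the modified lighting model (Section 4.1), evaluated per light:
       H = L_A K_A + sum_i S_i L_i [ K_D sin(theta) + K_S cos^n(phi + theta + pi) ]
                   + sum_i (1 - D_i) L_i f(omega) */
    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define OMEGA_0 (160.0 * M_PI / 180.0)  /* back-lighting viewing angle, 160 degrees */

    typedef struct {
        double L;      /* light intensity L_i */
        double S;      /* shadow attenuation S_i, from the shadow buffer */
        double D;      /* density D_i, looked up in this light's density map */
        double theta;  /* angle from the hair tangent to the light vector */
        double phi;    /* angle from the hair tangent to the eye vector */
        double omega;  /* angle from the light vector to the eye vector */
    } LightSample;

    static double f_backlight(double omega)   /* the ad hoc gate f(omega) */
    {
        return (omega >= OMEGA_0) ? 1.0 : 0.0;
    }

    double hair_intensity(double L_A, double K_A, double K_D, double K_S,
                          double spec_n, const LightSample *lights, int nlights)
    {
        double H = L_A * K_A;                           /* ambient term */
        for (int i = 0; i < nlights; ++i) {
            const LightSample *s = &lights[i];
            double c = cos(s->phi + s->theta + M_PI);
            if (c < 0.0) c = 0.0;                       /* clamp before the exponent */
            double reflected = K_D * sin(s->theta) + K_S * pow(c, spec_n);
            H += s->S * s->L * reflected;               /* reflected light */
            H += (1.0 - s->D) * s->L * f_backlight(s->omega);  /* refracted back light */
        }
        return H;
    }

In the pipeline above, such a function would be called once per segment endpoint, with D_i and S_i looked up in the corresponding light's density map and shadow buffer.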

Picture 1. (a) Rendered without back-lighting. (b) Rendered with back-lighting (alpha = 0.1). (c) Rendered with back-lighting (alpha = 0.05). (d) The white line points to the light source; the black line points to the eyes. (e) The density map of (b). (f) The density map of (c).

5. Results

We implemented the modified hair rendering pipeline on a Silicon Graphics Indigo 2 Extreme. This hardware supports Z-buffer hidden-surface removal, alpha blending, and linear color interpolation. The hair models used were generated from a 3D hairstyle fitting system [9] and several in-house tools. The rendering resolution is 1024x768 with a 24-bit true-color visual.

5.1 Effect of density map

Our first example, Picture 1, shows a simple hair model that illustrates the effect of the density map only. One light source is arranged against the eyes, which simulates a back-lighting environment, as shown in Picture 1(d). Pictures 1(a) and 1(b) are the resulting images without and with the back-lighting calculation, respectively. Note that the shining hairs in Picture 1(b) result from the back light. The alpha value used to create a density map represents the hair's thickness. Picture 1(c) is an extreme case that sets the alpha value to 0.05, i.e., much thinner hair, while Picture 1(b) uses an alpha value of 0.1. The corresponding density maps for Pictures 1(b) and 1(c) are shown in Pictures 1(e) and 1(f), respectively.

Picture 2. (a) The three white lines point to the three light sources; the black line points to the eyes. (b) The density maps created from the three light sources.

Table 1. The rendering time (in seconds) of the complex hair model shown in Picture 3, rendered with and without back-lighting, broken down into computing and reading the buffers for each of the three lights, rasterization, and total. The total time includes some other computations that are not listed here.

5.2 Performance

A hair model with a more complex hairstyle is rendered in Picture 3; the configuration is shown in Picture 2(a). Again, we rendered the hair model without and with the back-lighting effect, as presented in Picture 3 (left) and Picture 3 (right), respectively. Picture 2(b) shows the density maps created for Picture 3 (right). The hair model contains 18,700 hairs with a total of 490,700 line segments, and the rendering time is shown in Table 1. The computation of density maps takes an additional 18% of the rendering time.

6. Conclusion

We have demonstrated how the back-lighting effect can be reproduced by introducing an auxiliary density map. Creating the density map adds only a little extra time, since it can be created during the calculation of the shadow buffers, and the underlying technique, pixel blending, is widely supported by hardware. This solution is not necessarily physically accurate, but it coincides with human visual perception.

To simulate natural, realistic hair, modeling is a more important issue. During our experiments with the back-lighting effect, we found that if a hair model appeared unnatural, the resulting luster of the hair was also unsatisfactory, even when we applied the back-lighting effect. Therefore, we spent a lot of time improving hair models before rendering.

7. Acknowledgments

We are grateful to Jasmine Yung-Huei Yan for her assistance in the modeling of hair.

8. References

[1] James T. Kajiya and Timothy L. Kay, Rendering Fur with Three Dimensional Textures, ACM Computer Graphics, 23(3), Boston, July 1989.
[2] Robert E. Rosenblum, Wayne E. Carlson, and Edwin Tripp III, Simulating the Structure and Dynamics of Human Hair: Modelling, Rendering and Animation, The Journal of Visualization and Computer Animation, 2, 1991.
[3] André M. LeBlanc, Russell Turner, and Daniel Thalmann, Rendering Hair using Pixel Blending and Shadow Buffers, The Journal of Visualization and Computer Animation, 2, 1991.
[4] Agnes Daldegan, Nadia Magnenat Thalmann, Tsuneya Kurihara, and Daniel Thalmann, An Integrated System for Modelling, Animating and Rendering Hair, EUROGRAPHICS, 12(3), 1993, pp. C-211-C-221.
[5] Ken-ichi Anjyo, Yoshiaki Usami, and Tsuneya Kurihara, A Simple Method for Extracting the Natural Beauty of Hair, Computer Graphics, 26(2), Chicago, July 1992.
[6] Yasuhiko Watanabe and Yasuhito Suenaga, A Trigonal Prism-Based Method for Hair Image Generation, IEEE Computer Graphics & Applications, January 1992.
[7] Barbara Robertson, Hair-Raising Effects, Computer Graphics World, Oct.
[8] Clarence R. Robbins, Chemical and Physical Behavior of Human Hair, Van Nostrand Reinhold Co.
[9] C. L. Liang, A 3D Hairstyle Fitting System for Hair Modeling, Rendering, and Facial Texture Mapping, Master's thesis, Dept. of CSIE, National Taiwan University, Taiwan.
[10] Thomas Porter and Tom Duff, Compositing Digital Images, ACM Computer Graphics, 18(3), July 1984.
