TSBK03 Screen-Space Ambient Occlusion


Joakim Gebart, Jimmy Liikala
December 15, 2013

Contents

1 Abstract
2 History
  2.1 Crysis method
3 Chosen method
  3.1 Algorithm outline
  3.2 Blur
  3.3 Range check
  3.4 Noise
  3.5 Depth buffer and per-fragment normal vectors
4 SSAO in depth
  4.1 Retrieving a per-fragment depth and normal map
  4.2 Reconstructing the depth
  4.3 Generating random points in a hemisphere
  4.4 Computing the occlusion
5 OpenGL 4
  5.1 Tessellation
  5.2 Wireframe using geometry shaders (GS)
6 Result

1 Abstract

Ambient occlusion methods approximate the shading that results from ambient light being occluded by nearby geometry, which allows soft, mid-frequency proximity shadows to be simulated. The ambient occlusion term scales the ambient light at each point, so points that are less occluded become brighter and occluded points become darker, since less light is likely to reach them. Screen-Space Ambient Occlusion (SSAO) is a collection of methods where the calculations are done in screen space, i.e. per fragment. Calculating the ambient occlusion for every point on every surface would not be feasible in real time, but reducing the set of points to only the rendered pixels makes these methods practical on today's GPU hardware.

2 History

Offline calculation of ambient occlusion has been possible for quite some time by tracing rays of light, but the first real-time SSAO implementation in a larger game project was achieved by Crytek [X] in their engine CryEngine2, used for games such as Crysis (2007). Today (2012) every state-of-the-art game engine has some form of ambient occlusion implementation.

2.1 Crysis method

The algorithm used in Crysis samples a number of random points in a sphere centered around each rendered point. The occlusion factor is calculated from the number of samples that lie behind geometry, i.e. samples whose depth value is greater than the value in the depth buffer. Since modern game engines already perform post-processing steps in which the depth data is used, the depth buffer is available at no extra render cost. The occlusion factor is then used to darken the occluded parts of the image in a post-processing stage. Because the random points are sampled from a sphere in the Crysis method, there will in most cases be points that end up inside geometry, which makes convex surfaces appear brighter than flat walls in addition to darkening concave surfaces. This effect is not photorealistic and can be mitigated by choosing samples in other ways, but it also gives the graphics a distinctive look that some people enjoy.

3 Chosen method

The method chosen for computing the ambient occlusion is a variant of the Crysis method, where the random samples are picked inside a normal-oriented hemisphere (see figure 1) on the rendered surface instead of a sphere as in the original Crytek implementation. This sampling gives flat surfaces the same occlusion factor as convex surfaces while still darkening concave surfaces, resulting in a somewhat more realistic appearance.

Figure 1: Sample points from inside a hemisphere.

One drawback of this method is that the surface normal needs to be computed per fragment, if it is not already available. However, since surface normals are usually needed for per-pixel lighting or other post-processing effects, they will most likely already be available at no extra cost when using a modern graphics engine.

3.1 Algorithm outline

The algorithm is implemented as follows. For each fragment:

- Generate a number of random sample points in a hemisphere around the fragment.
- Project the sample points into screen space to find the matching value in the depth buffer.
- Compare the depth buffer value against the depth value of the sample point; if the sample point depth is greater than the depth buffer value, the sample point is occluded, so increment the occlusion factor.

The occlusion factor can then be used in a post-processing step to achieve ambient occlusion.

3.2 Blur

In order to keep frame rates interactive, the number of random samples has to be kept to a minimum. As a result, the occlusion factor has fewer levels of precision than what is required for nice, smooth shading. To reduce the problem, a blur is applied to the occlusion factor buffer before it is combined with the post-processing input image, which yields a smoother shading of the image.

3.3 Range check

If the blur were applied to the occlusion factor buffer as a whole, without any conditions, the shadows would bleed and could darken parts of the scene that are far behind or in front of the actual darkened corner. To eliminate this effect, a condition is added to the blur shader: it computes the z-distance between two points in the occlusion factor buffer and only blurs between points that are close to each other in the z-direction.

3.4 Noise

It is difficult to efficiently generate random numbers in a shader program. Therefore, instead of performing expensive mathematical operations for each fragment, a texture containing random white noise is used as a source of random numbers. By using the fragment coordinate as an offset into the texture, a random number can be generated that is not the same across all fragments. The fragment coordinate can also be multiplied by different prime factors on each random draw in the shader code to achieve a period that is longer than the size of the texture. The random numbers are used to build the vectors that form the points in the sampling hemisphere. The repeating pattern of the random texture will cause visible repetition in the result; to further increase the period of the pattern, the sampling vectors are also rotated around the normal axis using a value from the random texture.
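As an illustration of this lookup, a minimal GLSL sketch of a helper that could be used inside the SSAO fragment shader follows. The identifiers (uNoiseTexture, uNoiseSize) and the particular prime factors are assumptions, not values taken from the project.

#version 330 core
// Sketch of the random-number lookup described in section 3.4 (assumed names).

uniform sampler2D uNoiseTexture;   // small texture filled with white noise
uniform float     uNoiseSize;      // side length of the noise texture in texels

// A few prime pairs used to stretch the repetition period beyond the
// texture size; the factors actually used in the project are not known.
const vec2 primes[4] = vec2[4](vec2(2.0, 3.0), vec2(5.0, 7.0),
                               vec2(11.0, 13.0), vec2(17.0, 19.0));

// Returns the i:th pseudo-random vector for the current fragment,
// remapped from the stored [0,1] range to [-1,1].
vec3 randomVec3(int i)
{
    vec2 uv = fract(gl_FragCoord.xy * primes[i] / uNoiseSize);
    return texture(uNoiseTexture, uv).rgb * 2.0 - 1.0;
}

out vec4 fragColor;

void main()
{
    // Visualize the first random vector, mainly useful for debugging.
    fragColor = vec4(randomVec3(0) * 0.5 + 0.5, 1.0);
}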

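Sections 3.2 and 3.3 combine into a single depth-aware blur pass. The following GLSL fragment shader is a hedged sketch of such a pass, assuming the buffer layout from section 3.5 below (normals in rgb, linear depth in alpha); the uniform names, kernel size and depth threshold are placeholders rather than the project's actual values.

#version 330 core
// Sketch of the depth-aware blur from sections 3.2 and 3.3 (assumed names).

uniform sampler2D uOcclusion;       // occlusion factor from the SSAO pass
uniform sampler2D uNormalDepth;     // normals in rgb, linear depth in alpha
uniform vec2      uTexelSize;       // 1.0 / buffer resolution
uniform float     uDepthThreshold;  // largest z-distance that may be blurred across

in  vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    float centerDepth = texture(uNormalDepth, vTexCoord).a;
    float sum   = 0.0;
    float count = 0.0;

    // 4x4 box filter; only neighbours that are close in depth contribute,
    // so the darkening does not bleed across large depth discontinuities.
    for (int x = -2; x < 2; ++x) {
        for (int y = -2; y < 2; ++y) {
            vec2  offset = vec2(float(x), float(y)) * uTexelSize;
            float depth  = texture(uNormalDepth, vTexCoord + offset).a;
            if (abs(depth - centerDepth) < uDepthThreshold) {
                sum   += texture(uOcclusion, vTexCoord + offset).r;
                count += 1.0;
            }
        }
    }

    fragColor = vec4(vec3(sum / max(count, 1.0)), 1.0);
}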
3.5 Depth buffer and per-fragment normal vectors

The depth buffer must be rendered to a framebuffer object (FBO) before the occlusion factor is computed, because the occlusion factor depends on the depth values of nearby fragments. The normals, which have three components, are saved in the red, green and blue channels of the FBO, while the depth values are stored in the alpha channel.

One important thing regarding the depth is that the depth buffer (often referred to as the z-buffer) is not linear. The z-buffer has higher resolution (or more samples, if you look at it that way) close to the camera, and the resolution decreases further away from the camera. The linear depth is obtained by first computing the per-vertex depth, which is found by applying the modelview matrix to each vertex in the vertex shader; the depth is then saved into the alpha channel in the fragment shader, and depth values in between vertices are interpolated. The depth values are normalized by dividing the depth by the distance between the near plane and the far plane of the camera; this is done to simplify debugging. Figure 2 shows an example of a normalized, linear depth buffer.

Figure 2: Depth buffer with normalized values.

One important thing regarding normals is to keep them in the correct space: no translation or perspective division should be applied to them. The normal should stay perpendicular to the surface. This is achieved by applying the inverse transpose of the upper left part of the modelview matrix (also known as the normal matrix) to the normals.
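A minimal GLSL sketch of this geometry pass follows, with the two shader stages shown one after the other. All identifiers are assumed names, and a floating-point render target is assumed so that negative normal components survive being written to the texture.

// --- vertex shader ---
#version 330 core

uniform mat4  uModelView;
uniform mat4  uProjection;
uniform mat3  uNormalMatrix;  // inverse transpose of the upper-left 3x3 of uModelView
uniform float uNear;          // near-plane distance
uniform float uFar;           // far-plane distance

in vec3 aPosition;
in vec3 aNormal;

out vec3  vNormal;
out float vLinearDepth;

void main()
{
    vec4 viewPos = uModelView * vec4(aPosition, 1.0);
    // OpenGL looks down the negative z-axis, so the depth is negated and
    // then normalized by the near-to-far distance, as described above.
    vLinearDepth = -viewPos.z / (uFar - uNear);
    vNormal      = uNormalMatrix * aNormal;   // normal matrix, no translation
    gl_Position  = uProjection * viewPos;
}

// --- fragment shader ---
#version 330 core

in  vec3  vNormal;
in  float vLinearDepth;
out vec4  fragColor;

void main()
{
    // Normals in rgb, interpolated linear depth in alpha (section 3.5).
    fragColor = vec4(normalize(vNormal), vLinearDepth);
}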

4 SSAO in depth

The screen-space ambient occlusion is in this case (and in most cases) computed on the graphics processing unit (GPU) using the OpenGL Shading Language (GLSL), which enables real-time calculations at high speed. In this chapter the chosen SSAO algorithm is described in more detail, divided into the following steps:

- Retrieving a per-fragment depth and normal map
- Reconstructing the depth
- Generating random points in a hemisphere
- Computing the occlusion

4.1 Retrieving a per-fragment depth and normal map

The first thing that has to be computed and stored for later use is the depth map from the camera's point of view, often referred to as camera space, view space or eye space. The depth is stored in a texture, and at the same time the normals are saved for later use. The texture used is an RGBA texture with four channels: red, green, blue and alpha. The normals, which have three components, are saved in the red, green and blue channels, while the depth is stored in the alpha channel.

One important thing regarding the depth is that the depth buffer (often referred to as the z-buffer) is not linear. The z-buffer has higher resolution (or more samples, if you look at it that way) close to the camera, and the resolution continues to decrease further away from the camera. The higher resolution close to the camera exists because it is more important that everything looks correct there; in other words, it is more important that the z-buffer culling is correct close to the camera, so that the risk of visible culling artifacts is minimized. The linear depth is obtained by first computing the per-vertex depth, which is found by applying the modelview matrix to each vertex in the vertex shader; the depth is then saved into the alpha channel in the fragment shader, and depth values in between vertices are interpolated. If the depth values are normalized by dividing the depth by the distance between the near plane and the far plane of the camera, values between zero and one are obtained, which makes it simple to output the depth to the screen just to confirm that it is correct. In OpenGL the obtained depth values are also multiplied by minus one, because in OpenGL we look along the negative z-direction. If everything is done correctly the output should look something like figure 2 (depending on the scene).

One important thing regarding normals is to keep them in the correct space: no translation or perspective division should be applied to them. The normal should stay perpendicular to the surface. This can be achieved by applying the inverse transpose of the upper left part of the modelview matrix (figure 3) to the normals.

Figure 3: Illustration of the upper left part of the modelview matrix (marked as M1) that should be used to create the normal matrix.

4.2 Reconstructing the depth

To obtain the correct depth value for each pixel, a quad is used as input to the SSAO shader. The quad is built up of four vertices, according to figure 4, which in the vertex shader are stored as texture coordinates used to look up the depth value corresponding to each pixel. The screen coordinates in GLSL are always between -1.0 and 1.0 in the xy-plane, so the gl_Position that is written in the vertex shader has to be transformed into that range (figure 5). The fragment shader is then able to interpolate the correct texture coordinate for the corresponding pixel on the screen. In the fragment shader the depth is retrieved from the depth texture using the obtained coordinates, and the distance to the near view plane is then added to obtain the exact depth. Adding the distance to the near view plane is not absolutely necessary, since that distance is the same for every pixel.

Figure 4: Illustrates the quad and the positions of its vertices, which are used to access a pixel's corresponding depth value in the texture.

Figure 5: Illustrates the transformed vertex points of the quad that are used as gl_Position in the vertex shader in order to get the correct interpolated coordinates in the fragment shader.

4.3 Generating random points in a hemisphere

Generating random points in a hemisphere around each fragment can be achieved in many different ways; in this case a noise texture (figure 6) is used to obtain randomized values. Each texel of the texture contains three components (red, green, blue), which are used as the x-, y- and z-components of the random vector. Since the texture has values between 0 and 1, each value is multiplied by 2 and then 1 is subtracted to obtain values between -1 and 1. The random vector obtained in this way is a random point inside a sphere. To align it with the normal, the projection of the random vector onto the normal, normal * dot(normal, randomVector), is subtracted from the random vector. A cross product between the normal and this vector then gives a third vector, which is chosen to be called v2. At this point three perpendicular vectors oriented around the normal are available: the normal itself, the orthogonalized random vector and v2. These three vectors are used as a new basis in the next step, to easily generate several random positions inside the hemisphere.

Figure 6: The noise texture used to obtain randomized points inside a hemisphere.
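A hedged GLSL sketch of the basis construction just described, written as a helper for the occlusion shader; the texture and variable names are assumptions.

#version 330 core
// Helper used inside the SSAO fragment shader (assumed names).

uniform sampler2D uNoiseTexture;   // white-noise texture, values in [0,1]
uniform sampler2D uNormalDepth;    // normals in rgb, linear depth in alpha

in vec2 vTexCoord;

// Builds three mutually perpendicular vectors around the surface normal,
// later used as a basis for placing sample points in the hemisphere.
void buildBasis(out vec3 n, out vec3 t, out vec3 b)
{
    n = normalize(texture(uNormalDepth, vTexCoord).rgb);

    // Remap the stored noise from [0,1] to [-1,1].
    vec3 rand = texture(uNoiseTexture, vTexCoord).rgb * 2.0 - 1.0;

    // Subtract the part of the random vector that is parallel to the
    // normal, i.e. normal * dot(normal, rand), leaving a tangent vector.
    t = normalize(rand - n * dot(n, rand));

    // The cross product gives the third perpendicular vector ("v2").
    b = cross(n, t);
}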

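Putting the stored depth and normals (section 4.1), the screen-space lookup (section 4.2) and the hemisphere basis (section 4.3) together gives the occlusion computation covered in section 4.4 below. The following fragment shader is only a sketch: the uniform names, the precomputed kernel of hemisphere offsets and the view-space position reconstruction from a symmetric projection matrix are all assumptions, and the latter differs in detail from the report's approach of adding the near-plane distance to the fetched depth.

#version 330 core
// Sketch of the occlusion pass; all names, the precomputed kernel and the
// position reconstruction are assumptions (see the text above).

const int KERNEL_SIZE = 16;

uniform sampler2D uNormalDepth;          // normals in rgb, linear depth in alpha
uniform sampler2D uNoiseTexture;         // white-noise texture, values in [0,1]
uniform vec3      uKernel[KERNEL_SIZE];  // offsets in the unit hemisphere, z >= 0
uniform mat4      uProjection;           // symmetric perspective projection assumed
uniform float     uNear;
uniform float     uFar;
uniform float     uRadius;               // sampling radius in view-space units

in  vec2 vTexCoord;
out vec4 fragColor;

// Reconstruct the view-space position from the stored normalized linear depth.
vec3 viewPosition(vec2 uv)
{
    float viewZ = -texture(uNormalDepth, uv).a * (uFar - uNear);
    vec2  ndc   = uv * 2.0 - 1.0;
    return vec3(-viewZ * ndc.x / uProjection[0][0],
                -viewZ * ndc.y / uProjection[1][1],
                viewZ);
}

void main()
{
    vec3 p = viewPosition(vTexCoord);
    vec3 n = normalize(texture(uNormalDepth, vTexCoord).rgb);

    // Hemisphere basis as in section 4.3.
    vec3 rand = texture(uNoiseTexture, vTexCoord).rgb * 2.0 - 1.0;
    vec3 t    = normalize(rand - n * dot(n, rand));
    mat3 tbn  = mat3(t, cross(n, t), n);

    float occlusion = 0.0;
    for (int i = 0; i < KERNEL_SIZE; ++i) {
        // Sample point inside the normal-oriented hemisphere.
        vec3 samplePos = p + tbn * uKernel[i] * uRadius;

        // Project the sample into screen space to find the matching texel.
        vec4 clip = uProjection * vec4(samplePos, 1.0);
        vec2 uv   = clip.xy / clip.w * 0.5 + 0.5;

        // Depth of the surface actually visible at that texel.
        float sceneDepth  = texture(uNormalDepth, uv).a * (uFar - uNear);
        float sampleDepth = -samplePos.z;

        // Occluded if the visible surface lies in front of the sample point.
        if (sceneDepth < sampleDepth)
            occlusion += 1.0;
    }

    // 1.0 = fully open, 0.0 = fully occluded; used to scale the ambient term.
    fragColor = vec4(vec3(1.0 - occlusion / float(KERNEL_SIZE)), 1.0);
}

The resulting factor is then smoothed with the range-checked blur from sections 3.2 and 3.3 before it is combined with the lit image.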
4.4 Computing the occlusion

The occlusion factor is computed as outlined in section 3.1: each sample point generated in the hemisphere is projected into screen space, its depth is compared against the depth stored in the texture, and the sample is counted as occluded when the stored surface lies in front of it. The resulting occlusion factor is blurred (sections 3.2 and 3.3) and used to darken the ambient term in the final image.

5 OpenGL 4

Another goal of the project was to utilize some new features of the OpenGL 4 pipeline. The most prominent such features are the tessellation shader stages, which can be used for on-the-fly geometry modification.

5.1 Tessellation

A new feature of OpenGL 4 and DirectX 11 is the tessellation pipeline. The tessellation stages in OpenGL 4 are designed to perform surface subdivision in hardware in order to get a mesh with higher resolution, which can then be transformed in various ways; one straightforward example is the use of a displacement map to generate surface features such as the grooves and ridges between stones in a stone wall. An appreciated feature of the tessellation pipeline is that the level of subdivision can be controlled per primitive, so it is possible to have a smooth level-of-detail transition based on distance from the eye. A user-controlled tessellation level was implemented for the purpose of testing the new tessellation features of the pipeline; the subdivision levels can be altered by pressing keys on the keyboard in the example implementation.

5.2 Wireframe using geometry shaders (GS)

The use of geometry shaders should be minimized as much as possible for performance reasons, but in some cases a geometry shader is the right tool, as when building a wireframe: in a geometry shader, access to all vertices of a primitive can be obtained. In order to evaluate the tessellation result, it is useful to be able to view the wireframe of the resulting mesh. A geometry shader was developed which allows the user to display both the original mesh and the subdivided mesh in separate colours, to distinguish them from each other. This is achieved by computing the distance from the fragment to the nearest original triangle edge and the distance from the fragment to the nearest subdivided triangle edge.

6 Result

Figure 7 shows an example of the final result of the SSAO shading.

Figure 7: Example of the SSAO shading.
