TSBK03 Screen-Space Ambient Occlusion

Joakim Gebart, Jimmy Liikala
December 15, 2013

Contents

1 Abstract
2 History
2.1 Crysis method
3 Chosen method
3.1 Algorithm outline
3.2 Blur
3.3 Range check
3.4 Noise
3.5 Depth buffer and per-fragment normal vectors
4 SSAO in depth
4.1 Retrieving a per-fragment depth normal map
4.2 Reconstructing the depth
4.3 Generate random points in hemisphere
4.4 Computing the occlusion
5 OpenGL 4
5.1 Tessellation
5.2 Wireframe using geometry shaders (GS)
6 Result

1 Abstract

Ambient occlusion methods approximate the shading that results from ambient light being occluded by nearby geometry, allowing a simulation of soft, middle-frequency proximity shadows. Ambient occlusion is used to scale the ambient light term at each point: points that are less occluded become brighter, and occluded points become darker, because less light is likely to reach the occluded spots. Screen-Space Ambient Occlusion (SSAO) is a collection of methods where the calculations are done in screen space, i.e. per fragment. Calculating the ambient occlusion for every point on every surface would not be feasible in real time, but reducing the number of points to only the rendered pixels makes these methods practical on today's GPU hardware.
2 History

Offline calculation of ambient occlusion has been possible for quite some time by tracing rays of light, but the first real-time implementation of SSAO in a larger game project was achieved by Crytek [X] in their engine CryEngine 2, used for games such as Crysis (2007). Today (2012) every state-of-the-art game engine has some form of ambient occlusion implementation.

2.1 Crysis method

The algorithm used in Crysis samples a number of random points in a sphere centered around each rendered point. The occlusion factor is calculated from the number of those samples that are behind geometry, i.e. where the depth value of the sampled point is greater than the value in the depth buffer. Since all modern game engines already perform additional post-processing steps, the depth data is available at no extra render cost, as it is already used by other parts of the post-processing shaders. The occlusion factor is used to darken the occluded parts of the image in a post-processing stage.

Because the random points are sampled from a sphere in the Crysis method, there will in most cases be points that are inside geometry, which makes convex surfaces appear brighter than flat walls in addition to darkening concave surfaces. This effect is not photorealistic, but it can be mitigated by choosing samples in other ways; it also gives the graphics a distinctive look that some people enjoy.

3 Chosen method

The method chosen for computing the ambient occlusion is a variant of the Crysis method, where the random samples are picked inside a normal-oriented hemisphere (see figure 1) on the rendered surface, instead of a sphere as in the original Crytek implementation. This sampling makes flat surfaces have the same occlusion factor as convex surfaces while still darkening concave surfaces, resulting in a somewhat more realistic appearance.
Figure 1: Sample points from inside a hemisphere.
One drawback of this method is that the surface normal needs to be computed per fragment, if not already available. However, since surface normals are usually needed for per-pixel lighting or other post-processing effects, they will most likely already be available at no extra cost in a real-life situation using a modern graphics engine.

3.1 Algorithm outline

The algorithm is implemented as follows, for each fragment:

- Generate a number of random sample points in a hemisphere around the fragment.
- Project the sample points into screen space to find the matching value in the depth buffer.
- Compare the depth buffer value against the depth value of the sample point; if the sample point's depth value is greater than the depth buffer value, the sample point is occluded, so increment the occlusion factor.

The occlusion factor can then be used in a post-processing step to achieve ambient occlusion.

3.2 Blur

In order to keep frame rates interactive, the number of random samples has to be kept to a minimum. As a result, the number of bits of numeric precision in the occlusion factor will be less than what is required for nice, smooth shading. To reduce the problem, a blur is applied to the occlusion factor buffer before combining the occlusion factor with the post-processing input image, which yields a smoother shading of the image.

3.3 Range check

If the blur were applied to the occlusion factor buffer as a whole, without any conditions, shadows would bloom and could darken parts that are far behind or in front of the actual darkened corner. To eliminate this effect, a condition is added to the blur shader that computes the z-distance between two points in the occlusion factor buffer and only applies blur between points that are close to each other in the z-direction.

3.4 Noise

It is difficult to efficiently generate random numbers in a shader program.
Therefore, instead of performing expensive mathematical operations for each fragment, a texture containing random white noise is used as a source of random numbers. By using the fragment coordinate as an offset into the texture, a random number can be obtained that is not the same across all fragments. The fragment coordinate can also be multiplied by different prime factors on each random draw in the shader code, to achieve a period that is longer than the size of the texture. The random numbers are used to build vectors that form the points in the sampling hemisphere. The repeating pattern of the random texture would cause visible repetition in the result; to further increase the period of the pattern, the sampling vectors are also rotated around the normal axis, using a value from the random texture as well.
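The per-fragment loop outlined above can be sketched as plain CPU-side code. This is a hedged illustration, not the report's shader source: the names `samples`, `depth_lookup` and `project`, and the "larger z means farther from the camera" depth convention, are assumptions made for the sketch.

```python
def occlusion_factor(frag_pos, samples, depth_lookup, project):
    """Fraction of hemisphere samples occluded by nearby geometry.

    frag_pos     -- fragment position in view space (x, y, z)
    samples      -- offsets inside a normal-oriented hemisphere
    depth_lookup -- function (sx, sy) -> stored linear depth
    project      -- function mapping a view-space point to screen coords
    """
    occluded = 0
    for offset in samples:
        # Displace the fragment by the hemisphere sample offset.
        p = tuple(f + o for f, o in zip(frag_pos, offset))
        sx, sy = project(p)
        # The sample is occluded if the geometry recorded in the depth
        # buffer is closer to the camera than the sample point itself.
        if p[2] > depth_lookup(sx, sy):
            occluded += 1
    return occluded / len(samples)
```

In the real shader this runs per fragment, and the resulting factor is what the blur and range check of sections 3.2-3.3 then smooth before it darkens the final image.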
3.5 Depth buffer and per-fragment normal vectors

The depth buffer must be rendered to a framebuffer object (FBO) before computing the occlusion factor, because the occlusion factor depends on the depth values of nearby fragments. The normals, which in the three-dimensional world have three components, are saved in the red, green and blue channels of the FBO, while the depth values are stored in the alpha channel.

One important thing regarding the depth is that the depth buffer (often referred to as the z-buffer) is not linear: it has higher resolution (or more samples, if you look at it that way) closer to the camera, and the resolution decreases further away from the camera. The linear depth is in this case obtained by first computing the per-vertex depth, found by applying the modelview matrix to each vertex in the vertex shader; the depth is then saved into the alpha channel in the fragment shader, with depth values in between vertices interpolated for the fragment shader. The depth values are normalized by dividing the depth by the distance between the near plane and the far plane of the camera; this is done to simplify debugging. Figure 2 shows an example of a normalized, linear depth buffer.

Figure 2: Depth buffer with normalized values.

One important thing regarding normals is to keep them in the correct space: no translation or perspective division should be applied to the normals, which should stay perpendicular to the surface. This is achieved by applying the inverse transpose of the upper-left part of the modelview matrix (also known as the normal matrix) to the normals.
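The normal matrix construction described above can be written out directly. A minimal sketch, assuming a row-major 4x4 modelview matrix given as nested lists; the function name is illustrative:

```python
def normal_matrix(modelview):
    """Inverse transpose of the upper-left 3x3 block of a 4x4 modelview
    matrix, expanded by hand to keep the sketch dependency-free."""
    m = [row[:3] for row in modelview[:3]]  # upper-left 3x3 block
    det = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
         - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
         + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    # The cofactor matrix divided by the determinant equals the
    # inverse transpose (the adjugate is the transposed cofactor matrix).
    cof = [[(m[(i + 1) % 3][(j + 1) % 3] * m[(i + 2) % 3][(j + 2) % 3]
           - m[(i + 1) % 3][(j + 2) % 3] * m[(i + 2) % 3][(j + 1) % 3])
            for j in range(3)] for i in range(3)]
    return [[c / det for c in row] for row in cof]
```

For a pure rotation the result equals the rotation itself, while for a uniform scale by 2 the normals are scaled by 0.5, which is exactly why the inverse transpose keeps normals perpendicular to the surface.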
4 SSAO in depth

The screen-space ambient occlusion is in this case (as in most cases) computed on the graphics processing unit (GPU) using the OpenGL Shading Language (GLSL), which enables real-time calculation at high speed. In this chapter the chosen SSAO algorithm is described in more detail, divided into the following steps:

- Retrieving a per-fragment depth normal map
- Reconstructing the depth
- Generating random points in a hemisphere
- Computing the occlusion

4.1 Retrieving a per-fragment depth normal map

The first thing to compute and store for later use is the depth map from the camera's point of view, often referred to as camera space, view space or eye space. The depth is stored in a texture, and the normals are saved at the same time for later use. The texture used here is an RGBA texture with four channels: red, green, blue and alpha. The normals, which in the three-dimensional world have three components, are saved in the red, green and blue channels, while the depth is stored in the alpha channel.

One important thing regarding the depth is that the depth buffer (often referred to as the z-buffer) is not linear: it has higher resolution (or more samples, if you look at it that way) closer to the camera, and the resolution continues to decrease further away from the camera. The resolution is higher close to the camera because it is more important that everything looks correct there; in clearer words, it is more important that z-buffer culling is correct close to the camera, to minimize the risk of visible artifacts due to the culling.
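The nonlinearity just described can be made concrete with the standard formula for inverting a perspective projection's depth value. This is the generic OpenGL-convention relationship, not code taken from the report; `near` and `far` are the camera clip-plane distances:

```python
def linear_depth(z_ndc, near, far):
    """Convert a depth value in normalized device coordinates
    (z_ndc in [-1, 1], OpenGL convention) back to a linear eye-space
    distance. Most of the buffer's precision ends up near the camera,
    which is exactly the nonlinearity described above."""
    return (2.0 * near * far) / (far + near - z_ndc * (far - near))
```

At z_ndc = -1 this returns the near-plane distance and at z_ndc = +1 the far-plane distance; dividing the result by the near-to-far distance gives the normalized [0, 1] style values the report stores for debugging.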
The linear depth is in this case obtained by first computing the per-vertex depth, found by applying the modelview matrix to each vertex in the vertex shader; the depth is then saved into the alpha channel in the fragment shader, with depth values in between vertices interpolated for the fragment shader. Normalizing the depth values by dividing the depth by the distance between the near plane and the far plane of the camera yields values between zero and one, which makes it simple to output the depth to the screen, just to confirm that it is correct. In OpenGL the obtained depth values are also multiplied by minus one, because OpenGL looks in the negative z-direction. If everything is done correctly, the output should look something like figure 2 (depending on the scene).

One important thing regarding normals is to keep them in the correct space: no translation or perspective division should be applied to the normals, which should stay perpendicular to the surface. This can be achieved by applying the inverse transpose of the upper-left part of the modelview matrix (figure 3) to the normals.

Figure 3: Illustration of the upper-left part of the modelview matrix (marked as M1) that should be used to create the normal matrix.

4.2 Reconstructing the depth

To obtain the correct depth value for each pixel, a full-screen quad is used as input to the SSAO shader. The quad is built from four vertices, according to figure 4, which in the vertex shader are stored as texture coordinates used to fetch the depth value corresponding to each pixel.

Figure 4: The quad and the positions of its vertex points, used to access each pixel's corresponding depth value from the texture.

Screen coordinates in GLSL are always between -1.0 and 1.0 in the xy-plane; therefore the gl_Position stored in the vertex shader has to be transformed into that range (figure 5). At this stage the fragment shader is able to interpolate the correct texture coordinate for each pixel on the screen. In the fragment shader the depth is retrieved from the depth texture using the obtained coordinates, and the near-plane distance is then added to obtain the exact depth. Adding the distance to the near plane is not strictly necessary, since that distance is the same for every pixel.

Figure 5: The transformed vertex points of the quad, used as gl_Position in the vertex shader in order to get correctly interpolated coordinates in the fragment shader.

4.3 Generate random points in hemisphere

Generating random points in a hemisphere around each fragment can be achieved in many different ways; in this case a noise texture (figure 6) is used to obtain randomized values. Each pixel of the texture contains three components (red, green, blue), which are used as the x, y and z components of a random vector. Since the texture has values between 0 and 1, each value is multiplied by 2 and 1 is then subtracted, to obtain values between -1 and 1.

Figure 6: The noise texture used to obtain randomized points inside a hemisphere.

Once the random vector is obtained, it represents a random point inside a sphere. To align it with the hemisphere around the normal, the vector is made perpendicular to the normal by subtracting normal * (normal . randomvector) from it. A cross product between the normal and this vector then gives a new vector, here called v2. This yields three mutually perpendicular vectors aligned with the hemisphere: the normal itself, the (projected) random vector, and v2. These three vectors are used as a new basis in the next step, to easily generate several random positions inside the hemisphere.
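The basis construction just described can be sketched with generic vector math. This is an illustration under the assumption that the inputs are unit-length and not parallel; the function name is made up for the sketch:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hemisphere_basis(normal, rand):
    """Build the {tangent, v2, normal} basis described in the text:
    project the random vector into the surface plane (Gram-Schmidt
    against the normal), then complete the basis with a cross product."""
    # Subtract normal * (normal . rand) so the result is perpendicular
    # to the normal, then renormalize it.
    d = dot(normal, rand)
    t = [r - d * n for r, n in zip(rand, normal)]
    length = math.sqrt(dot(t, t))
    t = [x / length for x in t]
    # v2 = normal x tangent gives the third perpendicular axis.
    n = normal
    v2 = [n[1] * t[2] - n[2] * t[1],
          n[2] * t[0] - n[0] * t[2],
          n[0] * t[1] - n[1] * t[0]]
    return t, v2, n
```

Scaling these three axes by further random values and summing them then yields sample points inside the normal-oriented hemisphere.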
4.4 Computing the occlusion

5 OpenGL 4

Another goal of the project was to utilize some new features of the OpenGL 4 pipeline. The most prominent such features are the tessellation shader stages, which can be used for on-the-fly geometry modification.

5.1 Tessellation

A new feature of OpenGL 4 and DirectX 11 is the tessellation pipeline. Tessellation in OpenGL 4 is designed to perform surface subdivision in hardware in order to get a mesh with higher resolution, which can then be transformed in various ways; one straightforward example is the use of a displacement map to generate surface features, such as the grooves and ridges between the stones in a stone wall. An appreciated feature of the tessellation pipeline is that the level of subdivision can be controlled per primitive; therefore it is possible to have a smooth level-of-detail transition based on distance from the eye. A user-controlled tessellation level was implemented for the purpose of testing the new tessellation features of the pipeline; the subdivision levels can be altered by pressing keys on the keyboard in the example implementation.

5.2 Wireframe using geometry shaders (GS)

The use of geometry shaders should be minimized as much as possible for performance reasons, but in some cases a geometry shader is the right tool, as when building up a wireframe: in a geometry shader, access to neighbouring primitives can be obtained, which is useful in that case. In order to evaluate the tessellation result, it is useful to be able to view the wireframe of the resulting mesh. A geometry shader was developed which allows the user to display both the original mesh and the subdivided mesh in separate colours, to distinguish them from each other. This is achieved by computing the distance from the fragment to the triangle edge, and the distance from the fragment to the nearest subdivided triangle edge.
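The edge-distance test behind the wireframe effect reduces to plain 2D point-to-segment distance. A sketch of that geometry (in the actual technique this runs per fragment in screen space; the threshold comparison and names here are illustrative assumptions):

```python
import math

def distance_to_edge(p, a, b):
    """Distance from point p to the line segment a-b. A fragment is
    coloured as wireframe when this distance is below a line-width
    threshold for any edge of its triangle."""
    ax, ay = a
    bx, by = b
    px, py = p
    ex, ey = bx - ax, by - ay
    # Project p onto the edge direction, clamped to the segment.
    t = ((px - ax) * ex + (py - ay) * ey) / (ex * ex + ey * ey)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * ex, ay + t * ey
    return math.hypot(px - cx, py - cy)
```

Evaluating this against the edges of the original triangle and against the subdivided triangle's edges, with two different colours, is what lets both wireframes be shown at once.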
6 Result

Figure 7 shows an example of the final result of the SSAO shading.
Figure 7: Example of the SSAO shading.
More informationCSE 167: Introduction to Computer Graphics Lecture #10: View Frustum Culling
CSE 167: Introduction to Computer Graphics Lecture #10: View Frustum Culling Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2015 Announcements Project 4 due tomorrow Project
More informationSpring 2009 Prof. Hyesoon Kim
Spring 2009 Prof. Hyesoon Kim Application Geometry Rasterizer CPU Each stage cane be also pipelined The slowest of the pipeline stage determines the rendering speed. Frames per second (fps) Executes on
More informationDiFi: Distance Fields  Fast Computation Using Graphics Hardware
DiFi: Distance Fields  Fast Computation Using Graphics Hardware Avneesh Sud Dinesh Manocha UNCChapel Hill http://gamma.cs.unc.edu/difi Distance Fields Distance Function For a site a scalar function f:r
More informationCopyright Khronos Group 2012 Page 1. Teaching GL. Dave Shreiner Director, Graphics and GPU Computing, ARM 1 December 2012
Copyright Khronos Group 2012 Page 1 Teaching GL Dave Shreiner Director, Graphics and GPU Computing, ARM 1 December 2012 Copyright Khronos Group 2012 Page 2 Agenda Overview of OpenGL family of APIs Comparison
More informationVolume Shadows Tutorial Nuclear / the Lab
Volume Shadows Tutorial Nuclear / the Lab Introduction As you probably know the most popular rendering technique, when speed is more important than quality (i.e. realtime rendering), is polygon rasterization.
More informationClipping. Angel and Shreiner: Interactive Computer Graphics 7E AddisonWesley 2015
Clipping 1 Objectives Clipping lines First of implementation algorithms Clipping polygons (next lecture) Focus on pipeline plus a few classic algorithms 2 Clipping 2D against clipping window 3D against
More informationWhite Paper. Solid Wireframe. February 2007 WP _v01
White Paper Solid Wireframe February 2007 WP03014001_v01 White Paper Document Change History Version Date Responsible Reason for Change _v01 SG, TS Initial release Go to sdkfeedback@nvidia.com to provide
More informationFall CSCI 420: Computer Graphics. 7.1 Rasterization. Hao Li.
Fall 2015 CSCI 420: Computer Graphics 7.1 Rasterization Hao Li http://cs420.haoli.com 1 Rendering Pipeline 2 Outline Scan Conversion for Lines Scan Conversion for Polygons Antialiasing 3 Rasterization
More information4) Finish the spline here. To complete the spline, double click the last point or select the spline tool again.
1) Select the line tool 3) Move the cursor along the X direction (be careful to stay on the X axis alignment so that the line is perpendicular) and click for the second point of the line. Type 0.5 for
More informationGraphics Hardware and OpenGL
Graphics Hardware and OpenGL Ubi Soft, Prince of Persia: The Sands of Time What does graphics hardware have to do fast? Camera Views Different views of an object in the world 1 Camera Views Lines from
More informationThe Rasterization Pipeline
Lecture 5: The Rasterization Pipeline Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2016 What We ve Covered So Far z x y z x y (0, 0) (w, h) Position objects and the camera in the world
More informationComputer Graphics Lecture 11
1 / 14 Computer Graphics Lecture 11 Dr. Marc Eduard Frîncu West University of Timisoara May 15th 2012 2 / 14 Outline 1 Introduction 2 Transparency 3 Reflection 4 Recap 3 / 14 Introduction light = local
More informationThree Main Themes of Computer Graphics
Three Main Themes of Computer Graphics Modeling How do we represent (or model) 3D objects? How do we construct models for specific objects? Animation How do we represent the motion of objects? How do
More informationCSE 690: GPGPU. Lecture 2: Understanding the Fabric  Intro to Graphics. Klaus Mueller Stony Brook University Computer Science Department
CSE 690: GPGPU Lecture 2: Understanding the Fabric  Intro to Graphics Klaus Mueller Stony Brook University Computer Science Department Klaus Mueller, Stony Brook 2005 1 Surface Graphics Objects are explicitely
More informationGLSL v1.20. Scott MacHaffie Schrödinger, Inc.
1 GLSL v1.20 Scott MacHaffie Schrödinger, Inc. http://www.schrodinger.com Table of Contents Introduction...2 Example 01: Trivial shader...2 Syntax...3 Types of variables...3 Example 02: Materials vertex
More informationCOSC 448: REALTIME INDIRECT ILLUMINATION
U B C O K A N A G A N Department of Computer Science COSC 448: REALTIME INDIRECT ILLUMINATION Written by Stephen Smithbower Supersor: Dr. Ramon Lawrence January 2010  April 2010 University of British
More information6.837 Introduction to Computer Graphics Assignment 5: OpenGL and Solid Textures Due Wednesday October 22, 2003 at 11:59pm
6.837 Introduction to Computer Graphics Assignment 5: OpenGL and Solid Textures Due Wednesday October 22, 2003 at 11:59pm In this assignment, you will add an interactive preview of the scene and solid
More informationPhysicallyBased Laser Simulation
PhysicallyBased Laser Simulation Greg Reshko Carnegie Mellon University reshko@cs.cmu.edu Dave Mowatt Carnegie Mellon University dmowatt@andrew.cmu.edu Abstract In this paper, we describe our work on
More informationCS 184: Assignment 2 Scene Viewer
CS 184: Assignment 2 Scene Viewer Ravi Ramamoorthi 1 Goals and Motivation This is a more substantial assignment than homework 1, including more transformations, shading, and a viewer for a scene specified
More informationDrawing Fast The Graphics Pipeline
Drawing Fast The Graphics Pipeline CS559 Fall 2016 Lectures 10 & 11 October 10th & 12th, 2016 1. Put a 3D primitive in the World Modeling 2. Figure out what color it should be 3. Position relative to the
More informationChapter 6 Lighting and Cameras
Lighting Types and Settings When you create a scene in Blender, you start with a few basic elements that will include a camera, but may or may not include a light. Remember that what the camera sees is
More informationCS450/550. Pipeline Architecture. Adapted From: Angel and Shreiner: Interactive Computer Graphics6E AddisonWesley 2012
CS450/550 Pipeline Architecture Adapted From: Angel and Shreiner: Interactive Computer Graphics6E AddisonWesley 2012 0 Objectives Learn the basic components of a graphics system Introduce the OpenGL pipeline
More informationCS4621/5621 Fall Computer Graphics Practicum Intro to OpenGL/GLSL
CS4621/5621 Fall 2015 Computer Graphics Practicum Intro to OpenGL/GLSL Professor: Kavita Bala Instructor: Nicolas Savva with slides from Balazs Kovacs, Eston Schweickart, Daniel Schroeder, Jiang Huang
More informationClipping. CSC 7443: Scientific Information Visualization
Clipping Clipping to See Inside Obscuring critical information contained in a volume data Contour displays show only exterior visible surfaces Isosurfaces can hide other isosurfaces Other displays can
More informationGraphics Hardware. Instructor Stephen J. Guy
Instructor Stephen J. Guy Overview What is a GPU Evolution of GPU GPU Design Modern Features Programmability! Programming Examples Overview What is a GPU Evolution of GPU GPU Design Modern Features Programmability!
More informationOpenGL refresher. Advanced Computer Graphics 2012
Advanced Computer Graphics 2012 What you will see today Outline General OpenGL introduction Setting up: GLUT and GLEW Elementary rendering Transformations in OpenGL Texture mapping Programmable shading
More informationParallelizing Graphics Pipeline Execution (+ Basics of Characterizing a Rendering Workload)
Lecture 2: Parallelizing Graphics Pipeline Execution (+ Basics of Characterizing a Rendering Workload) Visual Computing Systems Today Finishing up from last time Brief discussion of graphics workload metrics
More informationCSE 167: Lecture #8: Lighting. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2011
CSE 167: Introduction to Computer Graphics Lecture #8: Lighting Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2011 Announcements Homework project #4 due Friday, October 28 Introduction:
More informationParallel Triangle Rendering on a Modern GPU
Lecture 27: Parallel Triangle Rendering on a Modern GPU Parallel Computer Architecture and Programming CMU 15418/15618, Spring 2015 Let s draw a triangle on the screen Question 1: what pixels does the
More informationCSE528 Computer Graphics: Theory, Algorithms, and Applications
CSE528 Computer Graphics: Theory, Algorithms, and Applications Hong Qin State University of New York at Stony Brook (Stony Brook University) Stony Brook, New York 117944400 Tel: (631)6328450; Fax: (631)6328334
More informationThe Rasterization Pipeline
Lecture 5: The Rasterization Pipeline Computer Graphics and Imaging UC Berkeley What We ve Covered So Far z x y z x y (0, 0) (w, h) Position objects and the camera in the world Compute position of objects
More informationMassively Parallel Non Convex Optimization on the GPU Through the Graphics Pipeline
Massively Parallel Non Convex Optimization on the GPU Through the Graphics Pipeline By Peter Cottle Department of Mechanical Engineering, 6195 Etcheverry Hall, University of California, Berkeley, CA 947201740,
More information6.837 Introduction to Computer Graphics Assignment 5: OpenGL and Solid Textures Due Wednesday October 22, 2003 at 11:59pm
6.837 Introduction to Computer Graphics Assignment 5: OpenGL and Solid Textures Due Wednesday October 22, 2003 at 11:59pm In this assignment, you will add an interactive preview of the scene and solid
More informationRecall: Indexing into Cube Map
Recall: Indexing into Cube Map Compute R = 2(N V)NV Object at origin Use largest magnitude component of R to determine face of cube Other 2 components give texture coordinates V R Cube Map Layout Example
More informationCS GAME PROGRAMMING Question bank
CS6006  GAME PROGRAMMING Question bank Part A Unit I 1. List the different types of coordinate systems. 2. What is ray tracing? Mention some applications of ray tracing. 3. Discuss the stages involved
More information