Rendering
3D scene → rendering → image in the screen buffer (a 2D array of pixels)
Rendering in games: real-time (20, 30 or 60 FPS)
Algorithm: rasterization-based rendering
Hardware-based, pipelined, parallel architecture
Rendering primitives: mostly triangles (lines and points possible too)
Complexity: linear with the number of primitives
Marco Tarini
Rendering: rasterization of triangles
v0 = (x0, y0, z0), v1 = (x1, y1, z1), v2 = (x2, y2, z2)
GPU pipeline (shown: OpenGL 2.0)
GPU pipeline, simplified
GPU pipeline, simplified more
GPU pipeline, simplified even more
3D vertices (v0, v1, v2) → transform → 2D triangle on screen → rasterizer → "fragments" → fragment process → final pixels

Rasterization-based rendering: stages
Per vertex (vertex shader): skinning (from rest pose to current pose); transform (from object space to screen space)
Per triangle (rasterizer): rasterization; interpolation of per-vertex data
Per fragment (fragment shader): lighting (from normal + lights + material to RGB); texturing; alpha kill
Per fragment (output combiners): depth test; alpha blend
Rasterization-Based Rendering
3D vertices → per vertex → 2D triangle on screen → per triangle → "fragments" → per fragment → final pixels
The per-vertex and per-fragment stages are PROGRAMMABLE:
a user-defined "Vertex Shader" (or vertex program)
a user-defined "Fragment Shader" (or pixel program)
Shading languages
High level:
GLSL - OpenGL Shading Language (by Khronos)
HLSL - High Level Shader Language (Direct3D, by Microsoft)
CG - C for Graphics (by Nvidia)
Low level:
ARB Shader Program (an assembly language for GPUs -- deprecated)
In Unity (and, similarly, in many game engines):
a Mesh has a Mesh Renderer component, which includes several flags and settings
the Mesh Renderer has a Material, which includes flags, material parameter settings, and textures
the Material includes a Shader, which determines which settings/textures are available in the material; it can be one of the many standard shaders, or a customized one: use ShaderLab
In Unity: ShaderLab
A text file defining shaders and describing how the engine should use them. It defines:
a set of shaders to link (vertex, fragment) in CG language
fallback shaders (a plan B for when the running HW does not support the default shader)
the connection of material parameters / textures (visible to scripts / the Unity GUI) to shader uniforms (basically, global constants usable in shaders)

Rendering effects: lighting
Local lighting
LIGHT → reflection (BRDF) at the OBJECT → EYE
Lighting:
Material parameters (data modelling the «material»)
Illuminant (data modelling the lighting environment)
Geometric data (e.g. normal, tangent dirs, position of viewer)
→ LIGHTING MODEL (the lighting equation) → final R, G, B
Lighting equations
Many different equations: Lambertian, Blinn-Phong, Beckmann, Heidrich-Seidel, Cook-Torrance, Ward (anisotropic), plus added Fresnel effects...
Varying levels of: complexity, realism (some are physically based, some are just tricks), material parameters allowed, richness of effects. The simplest ones are the most commonly used. To learn more, see the Computer Graphics course!

Lighting equations: most basic solutions
Diffuse (aka Lambertian): physically based; dull materials only; only material parameter: base color (aka albedo, aka diffuse color)
Specular (aka Blinn-Phong): just a trick; adds simulated reflections (highlights); additional material parameters: specular intensity (or color), specular exponent (aka glossiness)
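The two basic terms above can be sketched in a few lines. This is an illustrative Python sketch (not engine or shader code): Lambertian diffuse plus a Blinn-Phong specular highlight for a single light; all parameter names (albedo, shininess, ...) are hypothetical.

```python
# Sketch of the most basic lighting equation:
#   color = albedo * max(N.L, 0) + specular * max(N.H, 0)^shininess
# where H is the half vector between light and view directions.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v]

def shade(normal, light_dir, view_dir, albedo,
          specular=1.0, shininess=32.0):
    """Return an RGB triple: Lambertian term + Blinn-Phong highlight."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    # Diffuse: cosine of the angle between normal and light direction
    diff = max(dot(n, l), 0.0)
    # Specular: half vector raised to the glossiness exponent
    h = normalize([l[i] + v[i] for i in range(3)])
    spec = specular * max(dot(n, h), 0.0) ** shininess
    return [albedo[i] * diff + spec for i in range(3)]
```

With light and viewer both along the normal, the diffuse term is the full albedo and the highlight is at its maximum, mirroring the role of the two material parameter sets listed above.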
Lighting: per-pixel VS per-vertex
Per pixel = more quality: computation in the fragment shader; interpolate the lighting input; material params can be in textures (more variation)
Per vertex = more efficiency: computation in the vertex shader; interpolate the lighting output; material params must be in vertices (little variation)
Usually: mixed! Partly per vertex (e.g. diffuse, local light dir computation), partly per pixel (e.g. specular, env. map, shadow map). Many effects require per-pixel: normal mapping, parallax mapping.

Lighting (recap): material parameters (data modelling the «material»), illuminant (data modelling the lighting environment), geometric data (e.g. normal, tangent dirs, position of viewer) → LIGHTING MODEL (the lighting equation) → final R, G, B
Illumination environments: discrete
A finite set of light sources; few of them (usually 1-4); each sitting in a node of the scene graph.
Types:
point light sources, with: position
spot-lights, with: position, orientation, wideness (angle)
directional light sources, with: orientation
Extra attributes: color / intensity (and other minor attributes)

Illumination environments: densely sampled
From each direction (on the sphere), a light intensity / color.
Asset to store that: environment map (or Reflection Probe), parametrized by angles (θ, φ).
Typical issue with lights in games: too many of them
Each light has a cost: compute a term of the lighting equation (for each vertex or fragment!); access all its parameters in the shaders; maybe, compute its shadows (!!!).
1-4 lights: ok. 20+ lights: not ok. But potentially needed? Physically speaking, a light source has an infinite range of effect.
Solution: light proxies
Full quality for the (e.g.) 4 most relevant lights. How to pick them (per object)? The closest ones, the brightest ones, the dynamic ones (as opposed to static). For them: shadows, full per-pixel lighting.
Approximate the other lights: no shadows, per vertex; aggregate them in env maps / light probes (populate the scene with ad-hoc light probes); just ignore the least relevant ones; give lights an artificially finite radius.
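The per-object "pick the most relevant lights" step can be sketched as follows. This is a minimal Python sketch under the assumption that relevance = distance to the object; the light representation (position, intensity pairs) and the function name are made up for illustration.

```python
# Sketch of light-proxy selection: keep the k closest lights at full
# quality; everything else is rendered approximated (per vertex,
# no shadows, or folded into an environment map).

def split_lights(lights, obj_pos, k=4):
    """lights: list of (position, intensity). Returns (full, approx)."""
    def dist2(light):
        pos, _intensity = light
        return sum((pos[i] - obj_pos[i]) ** 2 for i in range(3))
    ranked = sorted(lights, key=dist2)          # closest first
    return ranked[:k], ranked[k:]
```

A real engine would combine several criteria (brightness, dynamic vs. static) into the sort key instead of distance alone.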
Spherical functions (a common requirement in lighting)
Task: how to store a function f: Ω → Rⁿ, where Ω = the surface of a sphere, i.e. the set of all unit vectors (directions), and Rⁿ = some vector space (scalars, colors, vectors...).
Examples:
a (local) lighting environment (at a position p): f(x) = how much light comes into p from direction x
the lighting radiance (of a point p): f(x) = how much light p reflects toward direction x
local occlusions (for a point p): f(x) = is p seen from direction x? (in [0, 1])
We want efficient storage, synthesis and evaluation of f, plus the ability to interpolate between such functions.

Spherical functions: by sampling
Idea: just sample f (i.e. store it as a table).
Step 1: parametrize the sphere over a domain A, using a fixed function m: Ω → A. A = typically, a rectangle; m must be fast to evaluate, and not too distorted. Common choices for m?
Step 2: regularly sample A (as an image).
Then: to store f, just store the image. To evaluate f(x): access A at position m(x) (using bilinear interpolation or better). To interpolate between two f: just cross-fade the two images.
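One common choice for the parametrization m is the latitude-longitude map, sketched below in Python: azimuth and elevation of the direction become (u, v) texture coordinates in [0, 1]². The axis convention (y = up) is an assumption for illustration.

```python
# Sketch of m: Ω -> A for a lat-long environment map.
# u wraps around with the azimuth; v goes from the "north pole"
# (direction +y, v = 0) down to the "south pole" (v = 1).
import math

def latlong_uv(d):
    """Map a unit direction d = (x, y, z) to (u, v) in [0,1]^2."""
    x, y, z = d
    u = (math.atan2(z, x) / (2.0 * math.pi)) % 1.0   # azimuth
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # elevation
    return u, v
```

Storing f then means filling an image at these (u, v) positions; evaluating f(x) means a (bilinear) texture lookup at latlong_uv(x).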
Spherical functions: with Spherical Harmonics

Local lighting (recap)
Material parameters (data modelling the «material»), illuminant (data modelling the lighting environment), geometric data (e.g. normal, tangent dirs, position of viewer) → LIGHTING MODEL (the lighting equation) → final R, G, B
Material parameters
GPU rendering of a mesh in a nutshell (reminder):
Load: store all data in GPU RAM — geometry + attributes, connectivity (THE MESH ASSET); textures, shaders, material parameters, rendering settings (THE MATERIAL ASSET).
And fire! Send the command: do it!
Terminology
Material parameters: the parameters modelling the optical behavior of a physical object; (part of) the input of the lighting equation.
Material asset: an abstraction used by game engines, consisting of: a set of textures (e.g. diffuse + specular + normal map); a set of shaders (e.g. vertex + fragment); a set of global parameters (e.g. global glossiness); rendering settings (e.g. back-face culling Y/N?). It corresponds to a state of the rendering engine.

Authoring material parameters
Q: which material parameters need to be defined?
A: it depends on the chosen lighting equation.
Idea: the game engine lets the material artist choose intuitively named material parameters, then picks a lighting equation accordingly — the one best suiting them. Speak the material artist's language.
Authoring material parameters
Popular choice of intuitive parameters:
Base color (RGB)
Specularity (scalar)
Metal-ness (scalar)
Roughness (scalar)
(images: Unreal Engine 4)

PBM - Physically Based Materials
Basically, just a buzzword. Meanings:
1. use accurate material parameters: physically plausible, maybe measured — instead of made up and tuned by intuition (by the material artist);
2. keep each lighting element separated (e.g. in its own texture), using fewer shortcuts than usual. E.g. use base color in one texture and baked AO in another texture (the "Geometry Term"), instead of (base color × baked AO) in a single texture. (AO = Ambient Occlusion, see later.)
PBS - Physically Based Shading
Basically, just another buzzword. Meanings:
1. use PBM;
2. use a more complex lighting equation, more adherent to reality. E.g.: include HDR (and gamma-corrected rendering); include Fresnel effects; use energy-conserving lighting equations only.
General objective: make a material look plausible under a large range of lighting environments (much more challenging than targeting just one or a few!).

Local lighting in brief
Material properties (data modelling the «material»), illuminant (data modelling the lighting environment), geometric data (e.g. normal, tangent dirs, position of viewer) → LOCAL LIGHTING (the lighting equation) → final R, G, B
Reminder: normals — a per-vertex attribute of meshes.
Reminder: tangent dirs — normal mapping (in tangent space) requires tangent dirs; «anisotropic» BRDFs require a tangent dir.
Material quality: it's improving fast
(images: indie 2006 vs. indie 2010)
Material quality: improving
Local lighting in brief
Material properties (data modelling the «material»), illuminant (data modelling the lighting environment), geometric data (e.g. normal, tangent dirs, position of viewer) → LOCAL LIGHTING (the lighting equation) → final R, G, B

Lighting equation: how
Computed in the fragment shader. Most game engines support a subset of equations as default ones; any custom one can be programmed in shaders!
Material + geometry parameters are stored: in textures (highest-frequency variations); in vertex attributes (smooth variations); as material asset parameters (no variation).
For example: where are diffuse color, specular color, normals, tangent dirs typically stored?
How to feed parameters to the lighting equation: a hard-wired choice of the game engine; WYSIWYG game tools (e.g. in Unreal Engine 4).

Multi-pass rendering
Basic mechanism:
Pass 1: the resulting screen buffer is stored in a texture (not sent to the screen).
Pass 2: the final rendering uses that screen buffer as a texture.
The buffer is write-only in pass 1, read-only in pass 2. The two passes can be completely different: different settings, points of view, resolution...
Sometimes: more than 2 passes. Sometimes: pass 1 produces more than 1 buffer in parallel.
Multi-pass rendering: examples
Many custom effects, like:
Mirrors: pass 1 produces what is seen in the mirror; pass 2 textures the mirror surface with it.
An animated painting (think Harry Potter): pass 1 produces the painting content; pass 2, in the main scene, textures the painting with it.
Portals, as in the Portal series (Valve).
We will see a few standard effects requiring multi-pass rendering (such as shadow maps).

One sub-class of multi-pass rendering: screen-space effects
Basic mechanism:
Pass 1: the scene is rendered from the main camera point of view. Produces an RGB buffer and a depth buffer (sometimes other buffers too: "multiple render targets").
Pass 2: one big quad is rendered, covering the screen exactly; it uses the produced buffer(s) as texture(s), adding all kinds of effects (e.g.: blur?).
Basically, it's post-production in real time.
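The "pass 2" of a screen-space effect is essentially an image filter over the buffer produced by pass 1. As a minimal illustration (a Python sketch on a plain 2D list standing in for the RGB buffer, not GPU code), here is the box blur mentioned above:

```python
# Sketch of a screen-space post-process: a 3x3 box blur applied to
# the buffer rendered in pass 1 (one scalar channel, for brevity).
# On a GPU this loop body would run in the fragment shader of the
# full-screen quad, sampling the pass-1 texture.

def box_blur(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:   # clamp at borders
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out
```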
Rendering techniques popular in games
Shadowing: shadow mapping (with PCF)
Screen Space Ambient Occlusion (SSAO)
Camera lens effects: flares, limited Depth Of Field (DoF), motion blur
High Dynamic Range (HDR)
Non-Photorealistic Rendering (NPR): contours, toon BRDF
Texture-for-geometry: bump mapping, parallax mapping

Shadow mapping
Shadow mapping
Shadow mapping in a nutshell: two passes.
1st rendering: camera in the light's position; produces a depth buffer, called the shadow map.
2nd rendering: camera in the final position; for each fragment, access the shadow map once to determine whether the fragment is reached by the light or not.
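The per-fragment test of the second pass can be sketched as a single depth comparison. This is an illustrative Python sketch: the shadow map is a 2D list of depths as seen from the light, and the small bias (a common trick against self-shadowing "acne") is an added assumption, not part of the slide.

```python
# Sketch of the shadow-map test: a fragment is in shadow when its
# depth from the light is greater than the depth stored in the
# shadow map at its projected position (something else is closer
# to the light along that ray).

def in_shadow(shadow_map, uv, frag_depth_from_light, bias=1e-3):
    """uv: integer texel coordinates of the fragment in light space."""
    u, v = uv
    closest_to_light = shadow_map[v][u]
    return frag_depth_from_light > closest_to_light + bias
```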
Shadow mapping in a nutshell: LIGHT → SHADOW MAP; EYE → final SCREEN BUFFER.

Shadow mapping: costs
Rendering the shadow map can be kept minimal: no color buffer writing (no lighting, no texturing...); just vertex transform and depth test; optimizations: view-frustum culling.
Still, it's a costly extra pass (for each light...): do it only for important lights. It can be baked once and reused for static objects (yet another good reason to tag them) — but this requires static lights too.
Shadow mapping: issues
Shadow-map bit depth: quantization artifacts. It matters! 16 bits are hardly enough.
Shadow-map resolution: aliasing artifacts. It matters! Remedies: higher resolution, PCF, multi-resolution shadow maps.

Screen Space AO
Video Game Dev - Univ Verona 2017 - 15/10/2017
Screen Space AO: OFF vs. ON (comparison images)
Screen Space AO in a nutshell
First pass: standard rendering. Produces an RGB image and a depth image.
Second pass: screen-space technique. For each pixel, compare its depth with its neighbors':
neighbors in front? the pixel is difficult to reach: partly negate the ambient light;
neighbors behind? the pixel is exposed to the environment: more ambient light.

(limited) Depth of Field
Depth out of the focus range: blurred. Depth in the focus range: sharp.
(limited) Depth of Field in a nutshell
First pass: standard rendering: RGB image, depth image.
Second pass: screen-space technique. Is the pixel inside the focus range? Keep it sharp. Outside? Blur it (blur = average with neighboring pixels; kernel size ≈ amount of blur).

HDR - High Dynamic Range
(vs. limited dynamic range)
HDR - High Dynamic Range in a nutshell
First pass: normal rendering, BUT use lighting / materials with HDR: pixel values not limited to [0..1]. E.g. the sun emits light with RGB = [500, 500, 500]; values > 1 are over-exposed, "whiter than white".
Second pass: screen-space technique: values > 1 bleed into the neighbors; i.e. overexposed pixels lighten their neighbors: they end up at max white (1,1,1), and their excess light bleeds into the surrounding pixels.

Parallax Mapping
(normal map only)
Parallax Mapping
(normal map + parallax map)

Parallax mapping in a nutshell
A texture-for-geometry technique, like normal mapping (and used in conjunction with it).
Requires a displacement map: texel = distance from the surface.
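The core of basic parallax mapping is a small texture-coordinate offset computed from the displacement map. This Python sketch shows the idea under common assumptions not stated in the slide: the view direction is expressed in tangent space (z pointing away from the surface), and a user-tuned scale controls the strength of the effect.

```python
# Sketch of basic parallax mapping: shift the texture lookup along
# the (tangent-space) view direction, proportionally to the height
# read from the displacement map at the original coordinate.

def parallax_uv(uv, view_ts, height, scale=0.05):
    """uv: original texture coords; view_ts: tangent-space view dir."""
    vx, vy, vz = view_ts
    return (uv[0] + scale * height * vx / vz,
            uv[1] + scale * height * vy / vz)
```

The fragment shader would then sample color and normal maps at the offset coordinate, faking the parallax that real surface relief would produce.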
Motion Blur

NPR rendering / Toon shading / Cel shading
NPR rendering: toon shading / cel shading
Toon shading / cel shading in a nutshell: simulating toons. Typically, two effects:
add contour lines at the discontinuities of: 1. depth, 2. normals, 3. materials;
quantize the lighting: e.g. 2 or 3 tones (light, medium, dark) instead of a continuous interval.
It's a simple variation of the lighting equation.
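The lighting-quantization half of the effect is a one-line change to the lighting equation, sketched here in Python: the continuous Lambertian factor is snapped to a few flat tones. The tone values are made-up examples.

```python
# Sketch of the toon-shading tweak: replace the continuous diffuse
# ramp with a small number of discrete tones.

def toon_tone(n_dot_l, tones=(0.2, 0.6, 1.0)):
    """Map a diffuse factor in [0, 1] to one of a few flat tones."""
    idx = int(max(n_dot_l, 0.0) * len(tones))
    return tones[min(idx, len(tones) - 1)]
```

Everything else in the shader stays as in ordinary lighting; only the final intensity passes through this step function.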
NPR rendering: simulated pixel art
(img by Howard Day, 2015; img by Dukope)
Multi-pass rendering in Unity (notes)
Very simple to do (as usual). Steps:
create a Render Texture: a 2D GPU buffer which can be both the output of one rendering AND the texture of another rendering;
create one (secondary) Camera; set the Render Texture as the Target Texture of that Camera (the camera won't output its rendering to the screen);
add the Render Texture as a texture in some material.
What happens: every frame, Unity will: 1st pass, render to the texture (from the secondary camera); 2nd pass, use the result in the final rendering (from the main camera).
Screen Space effects in Unity (notes)
Very simple to do (as usual). Steps:
create a Shader (ShaderLab); pick an "image effect shader" (just to save initialization work);
create a Material which uses the shader;
add a Script to the main camera: add a public Material field to it; assign the new material to the field (from the GUI); redefine its OnRenderImage method, making it just do one blit operation (see next slide), using the material as a parameter.
All ready: the effect can now be coded in the fragment shader of the Shader: (multiple?) accesses to the texture(s), computation of the final RGB.

Screen Space effects in Unity (notes)

using UnityEngine;

public class CameraScript : MonoBehaviour {
    public Material mat;

    void OnRenderImage( RenderTexture src, RenderTexture dest ) {
        Graphics.Blit( src, dest, mat );
    }
}

blit = 2D screen-buffer copy. In Unity, it is implemented as a full-screen quad rendering.