Game FX
Introduction to Game FX (1/2): Improve the game's visual and sound effects. Includes: combat FX, environment FX, character FX, scene FX, sound FX, and post-processing after rendering. An FX editor is needed: general 3D animation tools cannot do this, and key-frame systems do not work, because FX animation is always procedural and depends on the previous frame. Use 3D animation tool plugins or tailor-made tools.
Introduction to Game FX (2/2): Small work but large effect. Since the GPU has become essential game hardware, game FXs are enhanced by shaders: skin rendering, HDR, image-based lighting, ambient occlusion, light scattering.
Tailor-made FX Editing Tool Example
Combat FX: During the combat: weapon motion blur, weapon effects, skill effects. After the combat: damage effects, team icon.
Combat FX Example
FX Uses Texture Animation: Almost all game FXs use this trick. A geometry object plays the texture animation: a billboard (a rectangle always facing the camera) or simple 3D objects (3D plate, cylinder, sphere, box, or a surface made by revolving a cross-section curve). Texture animation with color-key; semi-transparent textures; alpha blending (source color added to the background).
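The billboard trick above can be sketched in C: build the quad from the camera's right and up vectors so it always faces the camera. The `Vec3` helpers and function names are hypothetical, not from the slides.

```c
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static Vec3 v3(float x, float y, float z) { Vec3 r = {x, y, z}; return r; }
static Vec3 add(Vec3 a, Vec3 b) { return v3(a.x + b.x, a.y + b.y, a.z + b.z); }
static Vec3 scale(Vec3 a, float s) { return v3(a.x * s, a.y * s, a.z * s); }

/* Build the four corners of a billboard quad centered at `center`,
 * spanned by the camera's right and up vectors so the quad always
 * faces the camera regardless of its position in the world. */
void billboard_corners(Vec3 center, Vec3 camRight, Vec3 camUp,
                       float halfW, float halfH, Vec3 out[4])
{
    Vec3 r = scale(camRight, halfW);
    Vec3 u = scale(camUp, halfH);
    out[0] = add(add(center, scale(r, -1.0f)), scale(u, -1.0f)); /* bottom-left  */
    out[1] = add(add(center, r),               scale(u, -1.0f)); /* bottom-right */
    out[2] = add(add(center, r),               u);               /* top-right    */
    out[3] = add(add(center, scale(r, -1.0f)), u);               /* top-left     */
}
```

In practice the texture animation then cycles the quad's UVs through frames of a sprite sheet while the corners track the camera each frame.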
Particle System for FXs in Combat: The FXs: fire / explosion / smoke / dust. Initial value + time dependency. Combined with billboard FX: the billboard plays the texture animation, while the particle system calculates the motion path; gravity is the major force used. Emitter patterns: single emitter, area emitter, emitter on vertices.
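The "initial value + time dependency, gravity as the major force" idea is just per-particle integration. A minimal C sketch (the `Particle` struct and field names are assumptions for illustration):

```c
#include <math.h>

typedef struct {
    float px, py, pz;   /* position */
    float vx, vy, vz;   /* velocity (the emitter sets the initial value) */
    float life;         /* remaining lifetime in seconds */
} Particle;

/* Advance one particle by dt seconds using semi-implicit Euler:
 * gravity is the only force, matching the slide. */
void particle_update(Particle *p, float dt)
{
    const float g = -9.8f;      /* gravity along -y, m/s^2 */
    p->vy += g * dt;            /* integrate acceleration into velocity */
    p->px += p->vx * dt;        /* integrate velocity into position */
    p->py += p->vy * dt;
    p->pz += p->vz * dt;
    p->life -= dt;              /* the particle dies when life <= 0 */
}
```

An emitter simply spawns particles with initial positions/velocities drawn from its pattern (a point, an area, or the mesh vertices) and calls this update each frame.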
Environment FX: Weather: fog; rain / snow / wind (use a particle system). Traditional fog: from near to far, a standard hardware feature. Volume fog and layered fog: use a vertex shader. Volume lighting: polygon solution or shader solution. Day & night; light scattering.
Character FX: Fatality: case by case, needing creative solutions. Rendering effects on skins: environment mapping; bump map / normal map (needs a pixel shader); texture diffusion to simulate sub-surface scattering. Flexible body and fur: flexible body dynamics, real-time fur rendering.
Simple Skin Rendering - Rim Shading: A trick coming from rim lighting. Control the width of the rim effect with dot(N, V):

    float d = 1.0 - dot(N, V);
    diffuseColor = smoothstep(1.0 - rim_width, 1.0, d);

Add (1 - dot(N, V)) or (1 - dot(N, V))^2 to simulate the silhouette effect.
Skin Rendering - Sub-surface Scattering: Texture diffusion: render radiance to texture space, then use blurring to simulate the light diffusion on skin. Reference: GPU Gems 3, Chapter 14.
Texture Diffusion for Skins: Blend six different levels of Gaussian filters, weighted per R/G/B channel, to simulate texture diffusion.
Scene FX - Sky Box: Use a very large box or dome-like model to surround the whole game scene; use textures on the box or dome as the backdrop; use multiple textures and texture-coordinate animation to simulate the movement of the clouds.
Scene FX - Lens Flare: At runtime, calculate the position and orientation of the camera relative to the sun, then place textures to simulate the lens flare.
Scene FX - Light Scattering: Atmospheric light scattering is caused by dust, molecules, or water vapor. These can cause light to be scattered into the line of sight (in-scattering), scattered out of the line of sight (out-scattering), or absorbed altogether (absorption). Applies to skylight and sunlight; can be implemented in a vertex shader.
Scene FX - Light Scattering Examples: Without scattering / With scattering
Next-Gen Game FXs: Use shaders. Post-processing shaders (post-processing after 3D rendering): HDRI (high dynamic range imaging), motion blur, depth of field (DOF), bloom; human visual system simulation. Image-based lighting: an HDR image as the light source; natural lighting.
High Dynamic Range: Dynamic range is the ratio between the brightest and the darkest parts of an image. Ordinary digital images are low dynamic range (LDR): 2^8 levels, relative luminance (0.0 ~ 1.0). Natural light is high dynamic range (HDR): a physical quantity.
Office interior, indirect light from window: 1/60th sec shutter, f/5.6 aperture, 0 ND filters, 0 dB gain.
Outside in the shade: 1/1000th sec shutter, f/5.6 aperture, 0 ND filters, 0 dB gain; 16 times the light as inside.
Outside in the sun: 1/1000th sec shutter, f/11 aperture, 0 ND filters, 0 dB gain; 64 times the light as inside.
Straight at the sun: 1/10,000th sec shutter, f/11 aperture, 13 stops of ND filters, 0 dB gain; 5,000,000 times the light as inside.
Very dim room: 1/4th sec shutter, f/1.6 aperture, 0 stops ND filters, 18 dB gain; 1/1500th the light of inside.
Relative light levels across these scenes: 2,000,000,000 : 400,000 : 25,000 : 1,500 : 1
HDR Photography: Debevec and Malik, "Recovering High Dynamic Range Radiance Maps from Photographs", SIGGRAPH 97. Dynamic range of 300,000 : 1. Visualization: Greg Ward.
Dazzle Lighting
Dazzling Reflection
Fresnel Reflection: bright reflection off low-reflectance surfaces.
Exposure Control
DOF (Depth of Field)
Motion Blur
High Dynamic Range Imaging @ Half-Life 2. Copyright Valve Software; used for educational purposes only.
Real-time HDR Step 1: Render the 3D scene to an HDR texture (a high-precision RGBA image). LDR: RGBA with 8 ~ 12 bits per channel (0 ~ 255 or 0 ~ 4095), normalized to 0 ~ 1 with white = 1. HDR: RGBA with 16-bit or larger floating-point channels; color values are real numbers and can exceed 1.0; floating-point textures in .hdr format or in .dds float formats can be used.
Real-time HDR Step 2: Get the bright part (bloom).
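"Getting the bright part" is typically a bright-pass filter: keep only the portion of each HDR value above a threshold, then blur the result to produce the bloom. A minimal sketch (one common formulation; the slides do not specify the exact filter):

```c
/* Bright-pass filter for bloom: the part of the HDR value above
 * `threshold` feeds the bloom blur; everything below contributes nothing. */
float bright_pass(float hdr, float threshold)
{
    float v = hdr - threshold;
    return v > 0.0f ? v : 0.0f;
}
```

The same function is applied per channel (or to luminance) before downsampling and Gaussian blurring the bright buffer.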
Real-time HDR Step 3: Tone map. The last step in HDR rendering: remapping the HDR buffer into the visible domain (LDR).
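One widely used tone-mapping operator that performs this remap is Reinhard's global operator, L / (1 + L); the slides do not name a specific operator, so this is an illustrative choice, with an assumed `exposure` scale applied first:

```c
/* Reinhard global tone mapping: maps HDR luminance [0, inf) into [0, 1).
 * `exposure` scales the scene brightness before the remap. */
float tonemap_reinhard(float hdr, float exposure)
{
    float l = hdr * exposure;
    return l / (1.0f + l);
}
```

Values near zero pass through almost linearly, while very bright values compress smoothly toward (but never reach) white, which is exactly the "visible domain" remap the slide describes.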
Introduction to Image-based Lighting (1/3): Hollywood has been doing this since the '90s; now it is possible in real time.
Introduction to Image-based Lighting (2/3): Using images as light sources (IBL). Without IBL: harsh, simplistic, noticeably computer generated. With IBL: increases the scene's level of realism and its visual interest. Key technologies: HDR photography, omnidirectional photography, global illumination.
Introduction to Image-based Lighting (3/3): Omnidirectional photography: light coming from every direction; HDR light probe images. Global illumination: simulating how light travels from the light source and reflects between surfaces; radiosity / ray tracing; the rendering equation. Approximations: Screen Space Ambient Occlusion (SSAO) / Screen Space Directional Occlusion (SSDO), Pre-computed Radiance Transfer (PRT).
Capture HDR Light Probe Images (1/5): A digital camera with manual controls; a tripod & head.
Capture HDR Light Probe Images (2/5): A 100%-reflective sphere can capture more than 180° of reflection. An 8-inch stainless steel gazing sphere costs around USD 20.
Capture HDR Light Probe Images (3/5): Requirements: must capture the full dynamic range of the light within the scene; use HDR photography.
Capture HDR Light Probe Images (4/5): Assembling the light probe.
Capture HDR Light Probe Images (5/5): The other solutions must see in all directions: panoramic photography. Solutions: tiled photographs (QuickTime VR); a fish-eye lens (captures 180 degrees in a single view, but has a cropping problem on APS-C size CCD digital cameras and suffers from vignetting); scanning panoramic cameras.
Real-world HDR Lighting Environments: Funston Beach, Eucalyptus Grove, Uffizi Gallery, Grace Cathedral. Lighting environments from the Light Probe Image Gallery: http://www.debevec.org/probes/
Mirrored Sphere: Framing and focus. Blind spots: the camera & photographer appear inside the image, and the region directly behind the sphere is missing. Solution: capture another image rotated 90 degrees, then use HDRShop to integrate the two light probe images into one. Calibrating sphere reflectivity: common values are (r, g, b) = (0.632, 0.647, 0.652). Non-specular reflection: scratches, oxidation. Polarized reflection. Image reflection.
Omni-directional Image Mapping: ideal mirrored sphere, angular map, latitude-longitude map, cube map.
Ideal Mirrored Sphere: Captured by a mirrored sphere; squeeze & stretch. Mapping equations (y-up right-hand space):
From world to image:
    r = sin(arccos(-Dz) * 0.5) / (2 * sqrt(Dx^2 + Dy^2))
    (u, v) = (0.5 + r*Dx, 0.5 - r*Dy)
From image to world:
    r = sqrt((2u-1)^2 + (2v-1)^2)
    (θ, φ) = (atan2(2u-1, -2v+1), 2*arcsin(r))
    (Dx, Dy, Dz) = (sinφ cosθ, sinφ sinθ, -cosφ)
Angular Map: A popular format for light probes. Mapping equations (y-up right-hand space):
From world to image:
    r = arccos(-Dz) / (2π * sqrt(Dx^2 + Dy^2))
    (u, v) = (0.5 - r*Dy, 0.5 + r*Dx)
From image to world:
    (θ, φ) = (atan2(-2v+1, 2u-1), π * sqrt((2u-1)^2 + (2v-1)^2))
    (Dx, Dy, Dz) = (sinφ cosθ, sinφ sinθ, -cosφ)
Latitude-Longitude Map: A rectangular image domain with a 2:1 aspect ratio, u in [0, 2], v in [0, 1]; no seams; rotating about Y translates the image horizontally. Mapping equations (y-up right-hand space):
From world to image:
    (u, v) = (1 + atan2(Dx, -Dz)/π, arccos(Dy)/π)
From image to world:
    (θ, φ) = (π(u-1), πv)
    (Dx, Dy, Dz) = (sinφ cosθ, cosφ, sinφ sinθ)
Cube Map: A rectangular image domain, u in [0, 3], v in [0, 4]; no seams. We use the D3D native cube map format (y-up right-hand space).
More Image-Based Lighting Examples: Outdoor Light Probes
24 Samples per Pixel, 6h 22min
24 Samples per Pixel, 6h 22min
Approximate Image-based Lighting: Environment mapping for mirror-like surfaces (reflection). Ambient occlusion: non-directional shadows. Diffuse environment lighting: a diffuse environment map and a specular environment map.
Prepare the Diffuse/Specular Environment Map: Take a 360-degree panoramic HDR picture as the environment map, then use HDRShop to generate the diffuse/specular environment maps: HDR environment map, diffuse environment map, specular environment map.
Diffuse/Specular Environment Lighting: Here we demonstrate using the lat-long map (latitude-longitude map) as the environment map format. For the diffuse term: use the normal vector to look up the environment map, treating the image as the light source, to get the average lighting; if the average incoming light vector from ambient occlusion data is available, use it instead. For the specular term: use the reflection vector of the camera direction to look up the specular environment map.

    #define PI 3.1415926
    float2 latlong(float3 v)
    {
        v = normalize(v);
        float theta = acos(v.z);            // +z is up
        float phi   = atan2(v.x, v.y) + PI;
        return float2(phi, theta) * float2(0.5 / PI, 1.0 / PI);
    }
Why Ambient Occlusion? Shadows can't help you now!
How Ambient Occlusion? The sun is a directional light: shadows! The sky is a hemisphere light: ambient occlusion.
Ambient Occlusion: Reference: GPU Gems, Chapter 17, pp. 279-292. The technique was originally developed by Hayden Landis (2002) and colleagues at Industrial Light & Magic (ILM) for film production, in a non-real-time renderer. Overview: pre-processing steps (calculating the accessibility of light to the surface and the average direction of the incident light) followed by rendering (diffuse environment lighting). A popular real-time solution is Screen Space Ambient Occlusion (SSAO), which uses a normal map and a depth map.
AO - The Preprocessing Step (1/3): For each vertex, there are two things we need to know: the accessibility, i.e., what fraction of the hemisphere above that point P is un-occluded by other parts of the model, and B, the average direction of the un-occluded incident light.
AO - The Preprocessing Step (2/3): The accessibility value and the B vector are model dependent but not lighting dependent, so they can be computed offline in a preprocess with a ray tracer. The basic algorithm for computing the ambient occlusion quantities:

    For each vertex {
        Generate a set of rays over the hemisphere centered at the vertex normal
        Vector avgUnoccluded = Vector(0, 0, 0);
        int numUnoccluded = 0;
        For each ray {
            If (ray doesn't intersect anything) {
                avgUnoccluded += ray.direction;
                ++numUnoccluded;
            }
        }
        avgUnoccluded = normalize(avgUnoccluded);
        accessibility = numUnoccluded / numRays;
    }
AO - The Preprocessing Step (3/3): Ray generation: rejection sampling (the sampling rate is very critical); Monte Carlo sampling algorithms give a better distribution. The rejection sampling algorithm:

    while (TRUE) {
        x = RandomFloat(-1, 1);   // random float between -1 and 1
        y = RandomFloat(-1, 1);
        z = RandomFloat(-1, 1);
        if (x*x + y*y + z*z > 1) continue;          // ignore points outside the unit sphere
        if (dot(Vector(x, y, z), N) < 0) continue;  // ignore rays below the hemisphere
        return normalize(Vector(x, y, z));
    }