Bringing Hollywood to Real Time. Abe Wiley, 3D Artist, 3D Application Research Group
Overview > Film Pipeline Overview and Comparison with Games > The RhinoFX/ATI Relationship > Ruby 1 and 2: The Movies > Breakdown of Four Real-Time Visual Effects > Conclusion > Ruby 1 and 2: The Real-Time Experience > Q&A
Film Pipeline > Pre-viz > Story/storyboarding > Pre-production > Modeling > Texturing > Rigging > Layout (3D storyboard) > Animation > Final Layout / Effects > Lighting > Rendering > Compositing
The RhinoFX/ATI Relationship > Intro to RhinoFX > Tools > Pipeline > Memory > Preproduction > Techniques > Effects > Engine
Technical Constraints Given to RhinoFX > Polygon Budget > Ruby: 80,000 > Optico: 60,000 > Ninja: 25,000 > Environment: 150,000 > Lighting Limits > 3 dynamic lights per shot (1 shadow-casting) > Lightmaps used for the set
Technical Constraints Given to RhinoFX > Animation Limits > 35 total blend shapes > 5 simultaneous blend shapes > 4 weighted bones per vertex > Number of on-screen characters limited to 4 at once
Ruby 1 and 2: The Movies
Breakdown of Four Real-Time Visual Effects > Blurred Dissipated Dynamic Reflections (floor, Ruby 1) > Skin Rendering and PRT (Ruby 1 & 2) > Post/render effects (Ruby 1 & 2): Glow, Motion Blur, Heat Distortion > Dynamic Real-Time Reflections (Ruby 2): Dynamic Cube Maps
Blurred Dissipated Dynamic Reflections
Ruby 1 Reflections: Implementation > Mirror the current perspective camera position > Calculate a frustum for the geometry that could be reflected and render it into a buffer > For each pixel of the reflection buffer, calculate a height value (distance from the reflection plane) and fade it based on that distance > Project that buffer back onto the floor surface from the original camera
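The fade-and-composite steps above can be sketched as a small function. This is an illustrative reconstruction, not the demo's actual shader; the `fade_distance` and `reflectivity` parameters are assumed knobs.

```python
def reflection_fade(height, fade_distance):
    """Fade a reflected pixel based on its height above the reflection plane.

    height: distance of the reflected point from the floor plane.
    fade_distance: height at which the reflection has fully dissipated.
    """
    t = max(0.0, min(1.0, height / fade_distance))
    return 1.0 - t  # 1.0 at the plane, 0.0 once fully faded

def composite(floor_color, reflection_color, height, fade_distance,
              reflectivity=0.35):
    """Blend the projected reflection buffer back onto the floor surface."""
    f = reflection_fade(height, fade_distance) * reflectivity
    return tuple(fc * (1.0 - f) + rc * f
                 for fc, rc in zip(floor_color, reflection_color))
```

Geometry near the floor reflects strongly; reflections of taller geometry dissipate, which is what gives the floor its soft, blurred look.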
Reflection Buffer
Final Frame
Skin Rendering: Ruby 1 & Ruby 2
Ruby 1 Skin: Implementation > Geometry > Light in Texture Space > Blur > Sample Texture-Space Light > Back Buffer
Standard Lighting Model
Blurred Lighting Model
Spatially Varying Blur > Used to simulate the subsurface component of skin lighting > Used a growable Poisson disc filter > Read the kernel size from a texture > Allows varying the subsurface effect > Higher for places like ears/nose > Lower for places like cheeks
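A "growable" Poisson-disc blur can be sketched as follows: a fixed set of disc offsets is scaled per pixel by a kernel size read from a map. The tap offsets and image representation here are toy stand-ins, not ATI's actual filter data.

```python
# Illustrative Poisson-disc tap offsets in the unit disc (made up for
# this sketch; a real filter would use a generated well-distributed set).
POISSON_TAPS = [(-0.94, -0.40), (0.94, -0.77), (-0.09, 0.46),
                (0.34, 0.29), (-0.91, 0.45), (0.79, 0.19),
                (-0.32, -0.93), (0.44, 0.97)]

def sample(img, x, y):
    """Clamp-to-edge texture fetch on a list-of-rows image."""
    h, w = len(img), len(img[0])
    return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

def poisson_blur(img, size_map, max_radius=4.0):
    """Blur img; the per-pixel kernel radius comes from size_map (0..1),
    so ears/nose (large values) blur more than cheeks (small values)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            radius = sample(size_map, x, y) * max_radius
            total = sample(img, x, y)  # center tap
            for dx, dy in POISSON_TAPS:
                total += sample(img, x + round(dx * radius),
                                y + round(dy * radius))
            out[y][x] = total / (len(POISSON_TAPS) + 1)
    return out
```

Where the size map is zero the filter degenerates to an identity, so the blur map directly paints where subsurface softening appears.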
Blur Size Map (Blur Kernel Size) > Texture Space Lighting > Result
Shadows > Use shadow maps > Apply shadows during texture lighting > Get a free blur > Soft shadows > Simulates subsurface interaction > Lower precision/size requirements > Reduces artifacts > Shadows are computed only from the one key light
Texture Lighting with Shadows > Write distance from light into shadow map > Geometry > Light in Texture Space > Blur > Sample Texture-Space Light > Back Buffer
Shadow Map (depth) > Shadows in Texture Space > Shadowed Lit Texture
Results with Shadows
Pre-computed Radiance Transfer (PRT)
Global Illumination > Non-local lighting > Area light sources > Shadows > Interreflections > Subsurface scattering > Raytracing, radiosity, etc. > These are not real-time friendly
Real-Time Global Illumination > Preprocessor computes diffuse radiance and compresses it > Run-time engine compresses irradiance (I) the same way > Fast calculation to get the value of light at any point
PRT: Pros and Cons > Good for solid objects moving within a space > Not so good for skinned meshes, as intra-object occlusion changes when the mesh deforms > Multiple spherical harmonic sets can be recorded and LERPed to get more accurate results for skinned meshes > The transfer precompute can include partial transfer information for skin and other subsurface scattering
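The run-time side of PRT reduces to a dot product of precomputed per-vertex transfer coefficients with the light's projection into the same basis, and the slide's LERP trick blends two precomputed transfer vectors for a deforming mesh. A minimal sketch with made-up coefficient values:

```python
def prt_shade(transfer, light_coeffs):
    """Per-vertex PRT evaluation: dot(transfer, light).

    transfer: precomputed transfer coefficients for this vertex.
    light_coeffs: the light environment projected into the same basis
    (e.g. spherical harmonics)."""
    return sum(t * l for t, l in zip(transfer, light_coeffs))

def lerp_transfer(transfer_a, transfer_b, w):
    """Blend two precomputed transfer vectors (e.g. recorded at two
    poses of a skinned mesh) before shading, per the slide's note."""
    return [a * (1.0 - w) + b * w for a, b in zip(transfer_a, transfer_b)]
```

All the expensive occlusion and interreflection work happens in the preprocess; the per-vertex cost at run time is just this short dot product, which is what makes the technique real-time friendly.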
PRT on Ruby 2 Skin
Glows
Glows: Implementation > Render the shaded scene into the frame buffer > Render glow intensity into the alpha channel of the frame buffer > Copy the frame buffer to an off-screen surface and downsample to ¼ resolution > Copy this into 2 lower-resolution ping/pong buffers > Blur repeatedly (ping/pong) based on the alpha-buffer intensity (black = no blur, white = max blur) > Multiply the shaded pass by the last blurred buffer
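The downsample-then-ping/pong chain can be sketched on a 1-D "scanline" for clarity. This is a toy reconstruction (3-tap box blur, illustrative names), not the demo's actual pixel shaders:

```python
def downsample_quarter(buf):
    """Average groups of 4 samples: full res -> quarter res."""
    return [sum(buf[i:i + 4]) / 4.0 for i in range(0, len(buf), 4)]

def blur_pass(buf, glow_alpha):
    """One ping/pong pass: 3-tap box blur, weighted per sample by the
    glow intensity from the alpha channel (0 = no blur, 1 = max blur)."""
    out = []
    for i, v in enumerate(buf):
        left = buf[max(i - 1, 0)]
        right = buf[min(i + 1, len(buf) - 1)]
        blurred = (left + v + right) / 3.0
        a = glow_alpha[i]
        out.append(v * (1.0 - a) + blurred * a)
    return out

def glow(scanline, alpha, passes=4):
    """Downsample once, then blur repeatedly between two buffers."""
    small = downsample_quarter(scanline)
    small_a = downsample_quarter(alpha)
    for _ in range(passes):  # each iteration stands in for one ping/pong
        small = blur_pass(small, small_a)
    return small
```

Repeating a cheap small blur at ¼ resolution is far less fill-rate hungry than one wide blur at full resolution, which is the point of the ping/pong scheme.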
Glow Buffers
Final Frame
Motion Blur
Why Motion Blur?
Motion Blur: Implementation > Render the tunnel geometry to the back buffer > Render the velocity vector into a buffer image (R = x, G = y, B = distance, in screen space) > Take n samples per screen pixel, from the previous frame's position (calculated using the vector) to the current frame > Add all n samples together and divide by n to get the value > The quality of this effect is tunable by the number of samples > Blur the rendered tunnel based on this motion vector > Composite the characters back onto the newly motion-blurred tunnel
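The n-tap average along the velocity vector can be sketched as below. The sparse-dict image and the function names are illustrative conveniences, not the demo's code; `n` is the tunable quality knob from the slide.

```python
def motion_blur_sample(image, x, y, vel, n=8):
    """Average n taps from the previous-frame position (x - vx, y - vy)
    up to the current position (x, y).

    image: sparse {(x, y): intensity} stand-in for the back buffer.
    vel: per-pixel screen-space velocity from the vector buffer.
    """
    vx, vy = vel
    total = 0.0
    for i in range(n):
        t = i / (n - 1) if n > 1 else 0.0  # 0 = previous frame, 1 = current
        sx = round(x - vx * (1.0 - t))
        sy = round(y - vy * (1.0 - t))
        total += image.get((sx, sy), 0.0)
    return total / n
```

Fast-moving pixels (long velocity vectors) smear across many source texels while static pixels sample themselves n times and stay sharp, so the blur amount falls directly out of the vector buffer.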
Motion Vectors: R = screen x, G = screen y, B = distance
Non-blurred Image
Motion Vectors
Heat Distortion
Heat Distortion: Implementation > Quad shader > Take two pre-generated noise maps and animate them > One scaling and one scrolling > Combine them (multiply the values) > Map to a full-screen quad > Use their normals to offset the lookup into the back-buffered (pre-post-effects) render
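A toy sketch of the two-map distortion: one map scales over time, one scrolls, and their product drives the lookup offset into the scene render. The sine-based noise functions, the recentering constant, and `strength` are stand-ins for the pre-generated maps and tuning values, which the slides do not specify.

```python
import math

def noise_a(u, v, t):
    """'Scaling' map: frequency grows with time (stand-in noise)."""
    s = 1.0 + 0.5 * t
    return 0.5 + 0.5 * math.sin(6.28318 * u * s) * math.sin(6.28318 * v * s)

def noise_b(u, v, t):
    """'Scrolling' map: coordinates drift over time (stand-in noise)."""
    return 0.5 + 0.5 * math.sin(6.28318 * u) * math.sin(6.28318 * (v + 0.3 * t))

def distortion_offset(u, v, t, strength=0.01):
    """Combined (multiplied) noise remapped to a small signed UV offset."""
    n = noise_a(u, v, t) * noise_b(u, v, t)  # product stays in 0..1
    return (n - 0.25) * strength             # recenter, then scale down

def distorted_fetch(scene, u, v, t):
    """Look up the back-buffered render at a bent coordinate."""
    d = distortion_offset(u, v, t)
    return scene(u + d, v + d)
```

Because the offsets are small and animated, straight edges behind the effect appear to shimmer the way air does above hot exhaust.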
Heat Distortion: fetch value from noise map to calculate offset > Final Image
Dynamic Real-Time Reflections
Traditional Static Cube Map > Blurred environment map > Good for interior spaces
Dynamic Cube Map: Implementation > Create a low-resolution version of the tunnel with lightmaps from the high-resolution render > Render 6 small projections of the low-res geometry from Ruby's helmet POV (in cube formation) into off-screen buffers > Convert to a cube map and use it to dynamically look up the reflection > Explosions are also rendered into the cube map > They're cards, so not much overhead
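Once the six per-frame renders become cube faces, a reflection direction selects a face by its largest-magnitude component, which is standard cube mapping rather than ATI's actual code. A minimal sketch of the face selection (the u/v sign conventions here are illustrative; real APIs fix them per face):

```python
def cube_face(direction):
    """Return (face, u, v) for a direction vector.

    face is one of '+x', '-x', '+y', '-y', '+z', '-z', chosen by the
    dominant axis; u, v are the remaining components divided by the
    dominant magnitude, so they land in [-1, 1].
    """
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                 # x dominates
        return ('+x' if x > 0 else '-x', -z / ax, -y / ax)
    if ay >= ax and ay >= az:                 # y dominates
        return ('+y' if y > 0 else '-y', x / ay, z / ay)
    return ('+z' if z > 0 else '-z', x / az, -y / az)
```

Because the lookup only ever touches one face per ray, the six renders can stay small and low-poly, which is what keeps re-rendering them every frame affordable.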
Wireframe of Low-Polygon Hallway
Dynamic Reflections: Final Frame
Conclusion > Film and game convergence is happening > Keys to successful outsourcing > No longer just special features > Go back to the early days of film/CG for technique inspiration > Prioritize visual impact
Ruby 1 and 2: The Real-Time Experience