Further Graphics. Advanced Shader Techniques. Alex Benton, University of Cambridge Supported in part by Google UK, Ltd

Further Graphics: Advanced Shader Techniques. Alex Benton, University of Cambridge (alex@bentonian.com). Supported in part by Google UK, Ltd. 1

Lighting and Shading (a quick refresher)
Recall the classic lighting equation:
  I = ka + kd(N·L) + ks(E·R)^n
where
  ka is the ambient lighting coefficient of the object or scene
  kd(N·L) is the diffuse component of surface illumination ("matte")
  ks(E·R)^n is the specular component of surface illumination ("shiny"), where R = L - 2(L·N)N
We compute color by vertex or by polygon fragment:
  Color at the vertex: Gouraud shading
  Color at the polygon fragment: Phong shading
Vertex shader outputs are interpolated across fragments, so the code is clean whether we're interpolating colors or normals. 2

Shading with shaders
For each vertex, our Java code will need to provide:
  Vertex position
  Vertex normal
  [Optional] Vertex color, ka / kd / ks, reflectance, transparency
We also need global state:
  Camera position and orientation, represented as a transform
  Object position and orientation, to modify the vertex positions above
  A list of light positions, ideally in world coordinates
3
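As a minimal sketch, these inputs typically reach a GLSL vertex shader as per-vertex in attributes plus per-draw uniform state. The names below (vertexPosition, worldToScreen, lightPositions, and so on) are illustrative assumptions, not the identifiers used in the course code:

  #version 330
  // Per-vertex data supplied by the host (Java) code
  in vec4 vertexPosition;
  in vec3 vertexNormal;
  in vec4 vertexColor;             // optional

  // Global state, shared by every vertex in the draw call
  uniform mat4 modelToWorld;       // object position and orientation
  uniform mat4 worldToScreen;      // camera transform and projection
  uniform mat3 normalToWorld;      // transforms normals into world coordinates
  uniform vec3 lightPositions[4];  // light positions in world coordinates

  void main() {
    gl_Position = worldToScreen * modelToWorld * vertexPosition;
  }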

Shader sample: Gouraud shading

Vertex shader:
  #version 330
  uniform mat4 modelToScreen;
  uniform mat4 modelToWorld;
  uniform mat3 normalToWorld;
  uniform vec3 lightPosition;

  in vec4 v;
  in vec3 n;

  out vec4 color;

  const vec3 purple = vec3(0.2, 0.6, 0.8);

  void main() {
    vec3 p = (modelToWorld * v).xyz;
    vec3 n = normalize(normalToWorld * n);
    vec3 l = normalize(lightPosition - p);
    float ambient = 0.2;
    float diffuse = 0.8 * clamp(dot(n, l), 0.0, 1.0);
    color = vec4(purple * (ambient + diffuse), 1.0);
    gl_Position = modelToScreen * v;
  }

Fragment shader:
  #version 330
  in vec4 color;
  out vec4 fragmentColor;

  void main() {
    fragmentColor = color;
  }

Diffuse lighting d = kd(N·L) expressed as a shader. 4

Shader sample: Phong shading

Vertex shader:
  #version 330
  uniform mat4 modelToScreen;
  uniform mat4 modelToWorld;
  uniform mat3 normalToWorld;

  in vec4 v;
  in vec3 n;

  out vec3 position;
  out vec3 normal;

  void main() {
    normal = normalize(normalToWorld * n);
    position = (modelToWorld * v).xyz;
    gl_Position = modelToScreen * v;
  }

Fragment shader:
  #version 330
  uniform vec3 eyePosition;
  uniform vec3 lightPosition;

  in vec3 position;
  in vec3 normal;

  out vec4 fragmentColor;

  const vec3 purple = vec3(0.2, 0.6, 0.8);

  void main() {
    vec3 n = normalize(normal);
    vec3 l = normalize(lightPosition - position);
    vec3 e = normalize(position - eyePosition);
    vec3 r = reflect(l, n);
    float ambient = 0.2;
    float diffuse = 0.4 * clamp(dot(n, l), 0.0, 1.0);
    float specular = 0.4 * pow(clamp(dot(e, r), 0.0, 1.0), 2.0);
    fragmentColor = vec4(purple * (ambient + diffuse + specular), 1.0);
  }

a = ka, d = kd(N·L), s = ks(E·R)^n
GLSL includes handy helper methods for illumination such as reflect() -- perfect for specular highlights. 5

Shader sample: Gooch shading
Gooch shading is an example of non-photorealistic rendering. It was designed by Amy and Bruce Gooch to replace photorealistic lighting with a lighting model that highlights structural and contextual data.
They use the diffuse (N·L) term of the conventional lighting equation to choose a map between cool and warm colors. This is in contrast to conventional illumination, where lighting simply scales the underlying surface color.
Combined with edge-highlighting through a second render pass, this creates 3D models which look like engineering schematics.
Compare the Gooch shader, above, to the Phong shader (right).
Image source: "A Non-Photorealistic Lighting Model For Automatic Technical Illustration", Gooch, Gooch, Shirley and Cohen (1998). 6

Shader sample: Gooch shading

Vertex shader:
  #version 330
  // Original author: Randi Rost
  // Copyright (c) 2002-2005 3Dlabs Inc. Ltd.

  uniform mat4 modelToCamera;
  uniform mat4 modelToScreen;
  uniform mat3 normalToCamera;

  vec3 LightPosition = vec3(0, 10, 4);

  in vec4 vPosition;
  in vec3 vNormal;

  out float NdotL;
  out vec3 ReflectVec;
  out vec3 ViewVec;

  void main() {
    vec3 ecPos    = vec3(modelToCamera * vPosition);
    vec3 tnorm    = normalize(normalToCamera * vNormal);
    vec3 lightVec = normalize(LightPosition - ecPos);
    ReflectVec    = normalize(reflect(-lightVec, tnorm));
    ViewVec       = normalize(-ecPos);
    NdotL         = (dot(lightVec, tnorm) + 1.0) * 0.5;
    gl_Position   = modelToScreen * vPosition;
  }

Fragment shader:
  #version 330
  // Original author: Randi Rost
  // Copyright (c) 2002-2005 3Dlabs Inc. Ltd.

  uniform vec3 vColor;

  float DiffuseCool = 0.3;
  float DiffuseWarm = 0.3;
  vec3 Cool = vec3(0, 0, 0.6);
  vec3 Warm = vec3(0.6, 0, 0);

  in float NdotL;
  in vec3 ReflectVec;
  in vec3 ViewVec;

  out vec4 result;

  void main() {
    vec3 kcool  = min(Cool + DiffuseCool * vColor, 1.0);
    vec3 kwarm  = min(Warm + DiffuseWarm * vColor, 1.0);
    vec3 kfinal = mix(kcool, kwarm, NdotL);

    vec3 nrefl  = normalize(ReflectVec);
    vec3 nview  = normalize(ViewVec);
    float spec  = pow(max(dot(nrefl, nview), 0.0), 32.0);

    if (gl_FrontFacing) {
      result = vec4(min(kfinal + spec, 1.0), 1.0);
    } else {
      result = vec4(0, 0, 0, 1);
    }
  }
7

Shader sample: Gooch shading
In the fragment shader source, notice the use of the built-in variable gl_FrontFacing to distinguish front faces from back faces:
  if (gl_FrontFacing) { ...
This supports distinguishing front faces (which should be shaded smoothly) from back faces (which will be drawn in heavy black).
The fragment shader chooses the weighted color by blending the cool and warm colors:
  vec3 kfinal = mix(kcool, kwarm, NdotL);
Here mix() is a GLSL function which returns the linear interpolation between kcool and kwarm. The weighting factor is NdotL, the lighting value. 8

Shader sample: Gooch shading 9

Texture mapping Real-life objects rarely consist of perfectly smooth, uniformly colored surfaces. Texture mapping is the art of applying an image to a surface, like a decal. Coordinates on the surface are mapped to coordinates in the texture. 10
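In GLSL, that surface-to-texture mapping usually means interpolating (u, v) texture coordinates across each triangle and sampling a bound texture in the fragment shader. A minimal sketch (the sampler and attribute names here are illustrative assumptions):

  #version 330
  uniform sampler2D surfaceTexture;  // the decal image, bound by the host code

  in vec2 uv;                        // surface coordinates, interpolated per fragment
  out vec4 fragmentColor;

  void main() {
    // Look up the texel at this fragment's surface coordinate
    fragmentColor = texture(surfaceTexture, uv);
  }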

Procedural texture
Instead of relying on discrete pixels, you can get infinitely more precise results with procedurally generated textures. Procedural textures compute the color directly from the U,V coordinate without an image lookup.
For example, here's the code for the torus brick pattern (right):
  tx = (int) 10 * u
  ty = (int) 10 * v
  oddity = (tx & 0x01) == (ty & 0x01)
  edge = ((10 * u - tx < 0.1) && oddity) || (10 * v - ty < 0.1)
  return edge ? WHITE : RED
I've cheated slightly and multiplied the u coordinate by 4 to repeat the brick texture four times around the torus.
11
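A rough GLSL translation of that pseudocode, as it might appear in a fragment shader. The in/out names, colors, and the factor-of-4 u scaling are assumptions for illustration rather than the course's actual source:

  #version 330
  in vec2 uv;                 // parametric surface coordinate, e.g. from the torus
  out vec4 fragmentColor;

  const vec3 RED   = vec3(0.8, 0.1, 0.1);
  const vec3 WHITE = vec3(1.0, 1.0, 1.0);

  void main() {
    float u = uv.x * 4.0;     // repeat the brick pattern four times around the torus
    float v = uv.y;
    int tx = int(10.0 * u);
    int ty = int(10.0 * v);
    bool oddity = (tx % 2) == (ty % 2);
    bool edge = ((10.0 * u - float(tx) < 0.1) && oddity)
             || (10.0 * v - float(ty) < 0.1);
    fragmentColor = vec4(edge ? WHITE : RED, 1.0);
  }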

Non-color textures: normal mapping
Normal mapping applies the principles of texture mapping to the surface normal instead of the surface color. The specular and diffuse shading of the surface varies with the normals, as in a dent on the surface. If we duplicate the normals, we don't have to duplicate the dent.
In a sense, the renderer computes a trompe-l'œil image on the fly and paints the surface with more detail than is actually present in the geometry. 12
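As a sketch of the idea (the uniform names, texture encoding, and the assumption that the normal map and light direction share a coordinate space are all illustrative; a full implementation would typically also handle tangent space):

  #version 330
  uniform sampler2D normalMap;   // RGB texture storing normals remapped into [0,1]
  uniform vec3 lightDirection;   // direction from the surface toward the light,
                                 // assumed to be in the same space as the stored normals

  in vec2 uv;
  out vec4 fragmentColor;

  const vec3 baseColor = vec3(0.6, 0.5, 0.4);

  void main() {
    // Decode the stored normal from [0,1] back to [-1,1] and renormalize
    vec3 n = normalize(texture(normalMap, uv).rgb * 2.0 - 1.0);
    float diffuse = clamp(dot(n, normalize(lightDirection)), 0.0, 1.0);
    fragmentColor = vec4(baseColor * (0.2 + 0.8 * diffuse), 1.0);
  }

The lighting then responds to the dents recorded in the normal map even though the underlying geometry is flat.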

Non-color textures: normal mapping 13

Procedural texturing in the fragment shader
Building up the face in stages -- first the outline, then the mouth, then the eyes -- the final fragment shader is:

  // ...
  const vec3 CENTER = vec3(0, 0, 1);
  const vec3 LEFT_EYE = vec3(-0.2, 0.25, 0);
  const vec3 RIGHT_EYE = vec3(0.2, 0.25, 0);
  // ...

  void main() {
    bool isOutsideFace = (length(position - CENTER) > 1);
    bool isMouth = (length(position - CENTER) < 0.75)
        && (position.y <= -0.1);
    bool isEye = (length(position - LEFT_EYE) < 0.1)
        || (length(position - RIGHT_EYE) < 0.1);

    vec3 color = (isMouth || isEye || isOutsideFace) ? BLACK : YELLOW;
    fragmentColor = vec4(color, 1.0);
  }

(Code truncated for brevity -- again, check out the source on github for how I did the curved mouth and oval eyes.)
14

Advanced surface effects
  Ray-tracing, ray-marching!
  Specular highlights
  Non-photorealistic illumination
  Volumetric textures
  Bump-mapping
  Interactive surface effects
  Ray-casting in the shader
  Higher-order math in the shader
  ...much, much more!
15

Antialiasing with OpenGL
Antialiasing remains a challenge with hardware-rendered graphics, but image quality can be significantly improved with GPU hardware support.
The simplest form of hardware anti-aliasing is Multi-Sample Anti-Aliasing (MSAA): render everything at a higher sample resolution, then down-sample the image to blur the jaggies.
Enable MSAA in OpenGL with
  glfwWindowHint(GLFW_SAMPLES, 4);
16

Antialiasing with OpenGL: MSAA
Non-anti-aliased (left) vs 4x supersampled (right) polygon edge, using OpenGL's built-in supersampling support. Images magnified 4x. 17

Antialiasing on the GPU
MSAA suffers from high memory costs and can be very limiting in high-resolution scenarios (high demands on time and texture-access bandwidth).
Eric Chan at MIT described an optimized hardware-based anti-aliasing method in 2004:
  1. Draw the scene normally
  2. Draw wide lines at the objects' silhouettes
     a. Use blurring filters and precomputed luminance tables to blur the lines' width
  3. Composite the filtered lines into the framebuffer using alpha blending
This approach is great for polygonal models, but tougher for effects-heavy visual scenes like video games. 18

Antialiasing on the GPU
More recently, NVIDIA's Fast Approximate Anti-Aliasing ("FXAA") has become popular because it addresses MSAA's limitations. Abstract:
  1. Use local contrast (pixel-vs-pixel) to find edges (red), especially those subject to aliasing.
  2. Map these to horizontal (gold) or vertical (blue) edges.
  3. Given the edge orientation, select the highest-contrast pixel pair 90 degrees to the edge (blue/green).
  4. Identify edge ends (red/blue).
  5. Re-sample at higher resolution along the identified edges, using sub-pixel offsets of the edge orientations.
  6. Apply a slight blurring filter based on the amount of detected sub-pixel aliasing.
Image from https://developer.download.nvidia.com/assets/gamedev/files/sdk/11/fxaa_whitepaper.pdf
19

Preventing aliasing in texture reads
Antialiasing technique: adaptive analytic prefiltering.
The precision with which an edge is rendered to the screen is dynamically refined based on the rate at which the function defining the edge is changing with respect to the surrounding pixels on the screen.
This is supported in GLSL by the functions dFdx(F) and dFdy(F). These return the derivative with respect to X and Y, in screen space, of some variable F. They are commonly used to choose the filter width for antialiasing procedural textures.
(A) Jagged lines visible in the box function of the procedural stripe texture
(B) Fixed-width averaging blends adjacent samples in texture space; aliasing still occurs at the top, where adjacency in texture space does not align with adjacency in pixel space.
(C) Adaptive analytic prefiltering smoothly samples both areas.
Image source: Figure 17.4, p. 440, OpenGL Shading Language, Second Edition, Randi Rost, Addison Wesley, 2006. Digital image scanned by Google Books. Original image by Bert Freudenberg, University of Magdeburg, 2002. 20
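A sketch of how those derivatives are used to antialias a procedural stripe pattern (the stripe function, names, and constants here are illustrative; fwidth(x) is GLSL shorthand for abs(dFdx(x)) + abs(dFdy(x))):

  #version 330
  in float v;                 // texture coordinate driving the stripe pattern
  out vec4 fragmentColor;

  const vec3 DARK  = vec3(0.1, 0.1, 0.4);
  const vec3 LIGHT = vec3(0.9, 0.9, 0.9);

  void main() {
    float stripes = fract(v * 10.0);   // sawtooth in [0,1): ten stripes
    float width   = fwidth(v * 10.0);  // how fast the pattern changes across this pixel
    // Replace the hard step at 0.5 with a ramp roughly one pixel wide
    float t = smoothstep(0.5 - width, 0.5 + width, stripes);
    fragmentColor = vec4(mix(DARK, LIGHT, t), 1.0);
  }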

Antialiasing texture reads with Signed Distance Fields Conventional anti-aliasing in texture reads can only smooth pixels immediately adjacent to the source values. Signed distance fields represent monochrome texture data as a distance map instead of as pixels. This allows per-pixel smoothing at multiple distances. 21

Antialiasing texture reads with Signed Distance Fields
The bitmap becomes a height map: each pixel stores the distance to the closest black pixel (if white) or to the closest white pixel (if black). Distance from white is negative.
[Figure: a grid of per-pixel distance values, comparing conventional antialiasing (left) with the signed distance field (right).]
22

Antialiasing texture reads with Signed Distance Fields
Conventional bilinear filtering computes a weighted average of color, but an SDF computes a weighted average of distances. This means that a small step away from the original values we find smoother, straighter lines, where the slope of the isocline is perpendicular to the slope of the source data.
By smoothing the isocline of the distance threshold, we achieve smoother edges and nifty edge effects.
  low = 0.02; high = 0.035;
  double dist = bilinearSample(texCoords);
  double t = (dist - low) / (high - low);
  return (dist < low) ? BLACK
       : (dist > high) ? WHITE
       : BLACK * (1 - t) + WHITE * t;
Adding a second isocline enables colored borders.
23
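In a GLSL fragment shader the same thresholding might look like the sketch below; the sampler name and the reuse of the 0.02 / 0.035 thresholds are assumptions carried over from the pseudocode above, and smoothstep() gives a slightly smoother ramp than the linear blend:

  #version 330
  uniform sampler2D distanceField;   // monochrome SDF texture, bilinearly filtered

  in vec2 uv;
  out vec4 fragmentColor;

  const vec3 BLACK = vec3(0.0);
  const vec3 WHITE = vec3(1.0);

  void main() {
    float dist = texture(distanceField, uv).r;   // bilinear sample of the distance
    float t = smoothstep(0.02, 0.035, dist);     // 0 below 'low', 1 above 'high'
    fragmentColor = vec4(mix(BLACK, WHITE, t), 1.0);
  }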

Tessellation shaders
Tessellation is a new shader type introduced in OpenGL 4.x. Tessellation shaders generate new vertices within patches, transforming a small number of vertices describing triangles or quads into a large number of vertices which can be positioned individually.
One use of tessellation is in rendering geometry such as game models or terrain with view-dependent Levels of Detail ("LOD"). Another is to do with geometry what ray-tracing did with bump-mapping: high-precision realtime geometric deformation.
Note how triangles are small and detailed close to the camera, but become very large and coarse in the distance.
Florian Boesch's LOD terrain demo: http://codeflow.org/entries/2010/nov/07/opengl-4-tessellation/
jabtunes.com's WebGL tessellation demo
24

Tessellation shaders
How it works:
You tell OpenGL how many vertices a single patch will have:
  glPatchParameteri(GL_PATCH_VERTICES, 4);
You tell OpenGL to render your patches:
  glDrawArrays(GL_PATCHES, first, numVerts);
The Tessellation Control Shader specifies output parameters defining how a patch is split up: gl_TessLevelOuter[] and gl_TessLevelInner[]. These control the number of vertices per primitive edge and the number of nested inner levels, respectively.
Pipeline: Vertex shader → Tessellation Control Shader → Tessellation primitive generator → Tessellation Evaluation Shader → Geometry shader → Fragment shader → ...
25

Tessellation shaders
Each field is an array. Within the array, each value sets the number of intervals to generate during sub-primitive generation. The tessellation primitive generator generates new vertices along the outer edges and inside the patch, as specified by gl_TessLevelOuter[] and gl_TessLevelInner[].
Triangles are indexed similarly, but only use the first three Outer fields and the first Inner array field.
[Figure: a quad patch annotated with gl_TessLevelOuter[0] = 2.0, gl_TessLevelOuter[1] = 3.0, gl_TessLevelOuter[2] = 2.0, gl_TessLevelOuter[3] = 5.0, gl_TessLevelInner[0] = 3.0, gl_TessLevelInner[1] = 4.0.]
26
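A minimal Tessellation Control Shader for quad patches might set those levels like this sketch; the constant levels match the figure above, but real code would usually compute them from screen-space size or camera distance:

  #version 400
  layout(vertices = 4) out;   // each output patch has four control points

  void main() {
    // Pass each control point through unchanged
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;

    // Only one invocation needs to write the per-patch tessellation levels
    if (gl_InvocationID == 0) {
      gl_TessLevelOuter[0] = 2.0;
      gl_TessLevelOuter[1] = 3.0;
      gl_TessLevelOuter[2] = 2.0;
      gl_TessLevelOuter[3] = 5.0;
      gl_TessLevelInner[0] = 3.0;
      gl_TessLevelInner[1] = 4.0;
    }
  }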

Tessellation shaders
The generated vertices are then passed to the Tessellation Evaluation Shader, which can update vertex position, color, normal, and all other per-vertex data.
Ultimately the complete set of new vertices is passed to the geometry and fragment shaders.
Image credit: Philip Rideout, http://prideout.net/blog/?p=48
27
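A matching Tessellation Evaluation Shader for quads might place each generated vertex by bilinear interpolation of the patch corners, as in this sketch (the uniform name and the flat-patch setup are illustrative assumptions; a terrain shader would typically also displace the vertex, e.g. from a height map):

  #version 400
  layout(quads, equal_spacing, ccw) in;

  uniform mat4 modelToScreen;   // assumed name for the combined transform

  void main() {
    // gl_TessCoord.xy gives this generated vertex's (u, v) position within the patch
    vec4 a = mix(gl_in[0].gl_Position, gl_in[1].gl_Position, gl_TessCoord.x);
    vec4 b = mix(gl_in[3].gl_Position, gl_in[2].gl_Position, gl_TessCoord.x);
    vec4 p = mix(a, b, gl_TessCoord.y);
    gl_Position = modelToScreen * p;
  }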

CPU vs GPU: an object demonstration
NVIDIA: Mythbusters - CPU vs GPU: https://www.youtube.com/watch?v=-p28lkwtzri
Redux: http://www.youtube.com/watch?v=fkk933kk6gg
28

Recommended reading
Course source code on Github -- many demos (https://github.com/alexbenton/advancedgraphics)
The OpenGL Programming Guide (2013), by Shreiner, Sellers, Kessenich and Licea-Kane
  Some also favor The OpenGL Superbible for code samples and demos
  There's also an OpenGL-ES reference, same series
OpenGL Insights (2012), by Cozzi and Riccio
OpenGL Shading Language (2009), by Rost, Licea-Kane, Ginsburg et al.
The Graphics Gems series from Glassner
Anti-Aliasing:
  https://people.csail.mit.edu/ericchan/articles/prefilter/
  https://developer.download.nvidia.com/assets/gamedev/files/sdk/11/fxaa_whitepaper.pdf
  http://iryoku.com/aacourse/downloads/09-fxaa-3.11-in-15-slides.pdf
29